Computer Vision Based Human-Computer
Interaction Using Color Detection Techniques
Chetan Dhule
Computer Science & Engineering Department
G. H. Raisoni College of Engineering
Nagpur, India
chetandhule123@gmail.com
Trupti Nagrare
Computer Science & Engineering Department
G. H. Raisoni College of Engineering
Nagpur, India
trupti.nagrare@raisoni.net
Abstract— Gesture-based human-computer interaction allows people to control Windows applications by moving their hands through the air, making computers and devices easier to use. Existing solutions rely on gesture recognition algorithms that need specialized hardware, often involving complicated setups confined to the research lab. The algorithms used so far for gesture recognition are not practical or responsive enough for real-world use, partly because the image processing is applied to inadequate data. Because existing methods are based on gesture recognition algorithms, they require ANN training, which slows the whole process and reduces accuracy. The method we propose controls the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in pixel values of RGB colors in a video, without any ANN training, to obtain the exact sequence of hand and finger motion.
Keywords— computer vision, gesture recognition, speech recognition, human-computer interaction
I. INTRODUCTION
Existing solutions rely on gesture recognition algorithms that need exotic hardware, often involving elaborate setups confined to the research lab. Existing gesture recognition algorithms are not efficient or practical enough for real-world use, partly because the image processing is applied to inadequate data. Because existing methods are based on gesture recognition algorithms, they require ANN training, which slows the whole process and reduces accuracy. The method we propose controls the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in pixel values of RGB colors in a video, without any ANN training, to obtain the exact sequence of hand and finger motion.
II. PROBLEM DEFINITION
Unfortunately, most current approaches based on gesture recognition have several shortcomings. Some require bulky hardware: users must wear multiple sensors and stand near multiple calibrated cameras while gestures are processed. Most of the cameras used for capturing images rely on color data, so they are sensitive to environmental factors such as dynamic backgrounds and lighting conditions. The algorithms used to identify gestures from the data supplied by hardware such as cameras have proved unreliable when applied to many users during testing. Most current approaches work through recognition algorithms; since the time needed for the computer to recognize a gesture is longer than the time needed to display its result, there is always a lag that slows the application. In addition, all of these approaches rely on a specific, pre-fixed set of gestures. Finally, there is no workspace or environment that allows users to freely use gestures to complete tasks such as controlling mouse motion and events, and that is also easy to use.
III. OBJECTIVES
Existing solutions rely on gesture recognition algorithms that need exotic hardware, such as multiple sensors worn on the hand in the form of gloves to track the mouse coordinates, and often require the user to stand near multiple calibrated cameras in elaborate setups confined to the research lab. The gesture recognition algorithms in use are not practical or efficient enough for real-world use, partly because of the inadequate data on which the image processing is applied. Because existing methods are based on gesture recognition algorithms, they need ANN training, which slows the whole process and reduces accuracy. The main objective of the proposed method is to control the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in pixel values of RGB colors in a video, without any ANN training, to obtain the exact sequence of hand and finger motion.
IV. LITERATURE REVIEW
The existing literature briefly explains the processing of hand gestures. Earlier work by Freeman and Weissman [1]
2014 Fourth International Conference on Communication Systems and Network Technologies
978-1-4799-3070-8/14 $31.00 © 2014 IEEE
DOI 10.1109/CSNT.2014.192
allowed the user to control a television set using a video camera and computer-vision template-matching algorithms to detect the user's hand from across a room. In this approach, the user shows an open hand and an on-screen hand icon appears, which can be used to adjust various graphical controls, such as a volume slider. To activate a slider, the user has to cover the control for a fixed amount of time. Users enjoyed this alternative to the physical remote control, and the feedback of the on-screen hand was effective in assisting them. However, to activate the different controls, users had to hold their hands up for long periods, which is tiring. This kind of user fatigue is common in gesture-based interfaces and is known as "gorilla arm".
Other approaches use multiple cameras to detect and track hand motion by producing a 3D image [2][4]. Because these systems use multiple cameras, they require a careful installation process, as calibration parameters such as the distance between the cameras are important to the triangulation algorithms used. Since a large amount of video data must be processed in real time, these algorithms prove computationally expensive, and stereo matching typically fails on scenes with little or no texture. Ultimately, it is not possible to use such systems outside their special lab environments. In [3], Pranav Mistry presented the SixthSense wearable gestural interface, which used a camera and projector worn on the user's chest to let the user zoom in on projected maps (among other activities) with two-handed gestures. For the camera to detect the user's hand, the user had to wear brightly colored markers on the index fingers and thumbs. The regular webcam worn by the user was also sensitive to environmental conditions such as bright sunlight or darkness, which made it difficult to recognize the color markers. Wilson and Oliver [5]
aimed to create GWindows, a Minority Report-like environment. By pointing with a hand and using voice commands, the user could move an on-screen cursor on a Microsoft Windows desktop and trigger actions such as "close" and "scroll" to affect the underlying application windows. They concluded that users preferred interacting with hand gestures over voice commands, and that desktop workspaces designed for gesture interaction would be worthwhile in the future. As for online workspaces, several commercial and academic web-based collaboration solutions have existed for some time. However, interaction with other users in these environments is usually limited to basic sharing of media files, rather than full real-time collaboration on entire web-based applications and their data between users on distinctly deployed domains.
Cristian Gadea and Bogdan Ionescu [6] developed finger-based gesture control of a collaborative online workspace called UC-IC, in which the application runs within a web browser that determines the latest hand gesture. However, the system needs continuous internet connectivity, which is not always available in India; it is not possible to provide high-speed connectivity everywhere at all times. Besides this, it needs training to recognize gestures, which slows the system down. The methods in [7][8][9] are based on gesture recognition algorithms and need ANN training, which makes the whole process slow and reduces accuracy: each time a gesture is to be recognized, ANN training is required, which takes considerable time, so the system cannot match its output speed to the exact motion of the mouse pointer.
V. SYSTEM ARCHITECTURE
In this system we use different preprocessing techniques, with feature extraction as a tool for recognizing the pixel-based values, i.e., the coordinates of the RGB colors, by tracking the change in pixel position of the different color stickers attached to the user's fingers in real time. The updated values are then sent to the PC to track the motion of the mouse.
Figure 1: Block diagram of the different phases of the system.
A. Video Capturing: A continuous video is given as input to the laptop by our system.
B. Image Processing: Image segmentation is done in two phases:
1. Skin detection model: to detect the hand and fingers in the image.
2. Approximate median model: for background subtraction. It was observed that using both methods together gave much better segmentation for further processing.
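The approximate median model mentioned above has a standard formulation; the sketch below is our own illustrative Python (not the paper's code), assuming 8-bit grayscale frames stored as nested lists:

```python
# Approximate median background subtraction (illustrative sketch).
# For each pixel, the background estimate is nudged one intensity step
# toward the current frame; pixels far from the estimate are foreground.

def update_background(frame, background, threshold=30):
    """Return (foreground_mask, updated_background) for one grayscale frame."""
    rows, cols = len(frame), len(frame[0])
    mask = [[0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            pixel, bg = frame[y][x], background[y][x]
            # Nudge the running median estimate toward the observed pixel.
            if pixel > bg:
                background[y][x] = bg + 1
            elif pixel < bg:
                background[y][x] = bg - 1
            # Large deviation from the estimate => foreground (hand/finger).
            if abs(pixel - background[y][x]) > threshold:
                mask[y][x] = 1
    return mask, background
```

Run over successive frames, the estimate converges to the scene's static background, so a hand moving in front of the camera appears as a blob of foreground pixels.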
C. Pixel Extraction: In this phase we obtain the pixel sequence from the image, without any ANN training, to get the exact sequence of motion of the hands and fingers.
D. Color Detection: In this phase we extract the positions of the RGB colors from the pixel sequence, to detect the motion of the hand and fingers by calculating the change in pixel values of the RGB colors.
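Phases C and D can be illustrated with a short sketch. The function below (hypothetical, not the paper's code; pixel data assumed to be a flat list of (R, G, B) tuples) returns the centroid of the pixels close to the chosen marker color, which serves as the marker position for one frame:

```python
def marker_centroid(pixels, width, height, marker_rgb, tol=60):
    """Average position of pixels within `tol` of the marker color per channel."""
    mr, mg, mb = marker_rgb
    xs, ys = [], []
    for y in range(height):
        for x in range(width):
            r, g, b = pixels[y * width + x]
            if abs(r - mr) <= tol and abs(g - mg) <= tol and abs(b - mb) <= tol:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Comparing the centroid across consecutive frames gives the change in position of the color sticker, i.e., the motion to be mapped onto the mouse pointer.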
E. Controlling the Mouse Pointer Position: Signals are sent to the system to control mouse pointer motion and mouse events. An appropriate command is given to the PC to display the motion of the mouse pointer according to the motion of the user's fingers or hand.
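A minimal sketch of this phase, assuming example frame and screen sizes: the marker position found in the camera frame is scaled to screen coordinates. The Win32 SetCursorPos call that would apply the result is shown commented out so the sketch stays portable:

```python
def camera_to_screen(cx, cy, cam_w, cam_h, scr_w, scr_h, mirror=True):
    """Scale a camera-frame coordinate to screen coordinates.
    Mirroring the x axis makes the cursor follow the hand like a mirror."""
    if mirror:
        cx = cam_w - 1 - cx
    sx = int(cx * scr_w / cam_w)
    sy = int(cy * scr_h / cam_h)
    return sx, sy

# On Windows, the mapped point could then be applied with the user32 API:
# import ctypes
# sx, sy = camera_to_screen(320, 240, 640, 480, 1920, 1080)
# ctypes.windll.user32.SetCursorPos(sx, sy)
```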
VI. TECHNIQUES FOR PIXEL AND COLOR DETECTION
A. Video Capturing
1) Loading Drivers
A system may have multiple web cameras, each needing a camera driver, and each driver has a unique ID. The "capGetDriverDescription" function returns the name and ID of a driver.
2) Capturing
To capture the camera view:
obj = capCreateCaptureWindow();
To start showing the camera view in a picture box in our software:
SendMessage(connect, obj);
B. Processing frames of video
We cannot process the video directly, so we need to convert the video into images with the function picture = hdcToPicture(obj);. Suppose the camera is 16 MP and fps = 45 (frames per second); then 45 images must be processed per second. To get the detailed RGB (red, green, blue) values of the pixels, use the function GetBitmapBits().
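To make the real-time budget concrete (our own arithmetic, using the figures above):

```python
fps = 45
budget_ms = 1000 / fps        # time available to grab, scan, and react per frame
print(round(budget_ms, 1))    # ~22.2 ms

# A 16 MP frame has roughly 16,000,000 pixels; scanning every pixel at
# 45 fps means about 720 million pixel reads per second, which is why
# the method sticks to cheap per-pixel color comparisons.
pixels_per_second = 16_000_000 * fps
print(pixels_per_second)
```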
C. Getting Pixel Color:
Figure 2. Getting Pixel Color
D. Scanning
Figure 3. Scanning pixel-wise horizontally in the x direction and stepping down vertically in the y direction
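The scan order of Figure 3, row by row across x and stepping down in y, can be written as a small generator (illustrative only):

```python
def scan_order(width, height):
    """Yield (x, y) pixel coordinates row by row: across in x, then down in y."""
    for y in range(height):
        for x in range(width):
            yield (x, y)
```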
E. Algorithm for pixel and color detection
Figure 4. Algorithm for pixel and color detection
X – x-coordinate of the pixel in the image.
Y – y-coordinate of the pixel in the image.
R – red, G – green, B – blue.
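Reading Figure 4 with this legend: visit each (X, Y), fetch its (R, G, B), and record the pixel when it matches the tracked marker color. Since the exact thresholds in the figure are not reproduced here, the red-dominance rule below is our assumption:

```python
def detect_red_pixels(get_rgb, width, height, min_red=150, dominance=60):
    """Scan the image and collect coordinates whose color is dominantly red.
    `get_rgb(x, y)` returns the (R, G, B) triple of one pixel."""
    hits = []
    for y in range(height):          # Y: row index
        for x in range(width):       # X: column index
            r, g, b = get_rgb(x, y)
            # Red marker: strong red channel that clearly exceeds green/blue,
            # so white or gray pixels (high in all channels) are rejected.
            if r >= min_red and r - g >= dominance and r - b >= dominance:
                hits.append((x, y))
    return hits
```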
VII. METHODOLOGY
A. Hand Position tracking and mouse control
Figure 5. Hand Position tracking and mouse control
Getting user input virtually is the main aim of this module: the user moves a finger in front of the camera's capture area. This motion is captured by the camera and processed by the system frame by frame. After processing, the system obtains the finger coordinates, and once the coordinates are calculated, it updates the cursor position accordingly.
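The tracking step amounts to differencing the finger coordinates of consecutive frames; a sketch, where the gain factor and the relative-motion scheme are our illustrative choices rather than the paper's:

```python
def cursor_delta(prev, curr, gain=2.0):
    """Relative cursor movement from marker positions in consecutive frames.
    `gain` amplifies small finger motions into larger cursor motions."""
    if prev is None or curr is None:
        return (0, 0)  # marker lost in one of the frames: don't move
    dx = int((curr[0] - prev[0]) * gain)
    dy = int((curr[1] - prev[1]) * gain)
    return (dx, dy)
```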
B. Laser Pointer Detection
Figure 6. Laser Pointer Detection
C. Hand Gesture Based Auto Image Grabbing (Virtual Zoom In/Out)
Figure 7. Virtual Zoom In/Out
D. Camera Processing and Image Capturing:
Figure 8. Camera processing and image capturing
E. Virtual Sense for file handling.
This system makes use of virtual-sense technology to copy a file from one system to another within a local area network (LAN/Wi-Fi). The user makes a picking-up action on the file to be copied, moves it to the system where the file should be copied, and then releases it over that system.
VIII. RESULTS AND DISCUSSION
The software provides control of all mouse clicking events by means of a color marker. After several experiments, it was observed that a red color marker is more effective than markers of other colors.
Figure 9. Graphical user interface of the application
Figure 10. Start camera
Figure 11. Set the marker color
Figure 12. Control motion and clicking events of the mouse with the color marker set earlier
IX. CONCLUSION
This project can be very useful for people who want to control a computer without actually touching the system, or without a wireless mouse, which always needs a surface to operate on. Accuracy is higher when a red color marker is used than when other color markers are used individually. The problem that changing lighting conditions pose for color-based recognition has been addressed in this work by providing a button to set the marker color at the start of the application. There are still some problems with recognition speed: the speed of controlling the mouse motion is not 100%, and needs to be improved for some of the gestures. All mouse movements and key actions have already been mapped and work well under the given circumstances. As future scope, the application can be extended to work with mobile phones and game consoles. Other modes of human-computer interaction, such as voice recognition, facial expression, and eye gaze, can also be combined to make the system more robust and flexible.
ACKNOWLEDGMENT
I want to thank all the subjects who participated in our experiments, and my guide for her valuable guidance, advice, and help during this project. Finally, I thank my parents for their encouragement.
REFERENCES
[1] W. T. Freeman and C. D. Weissman, "Television Control by Hand Gestures", in Proc. Int. Workshop on Automatic Face and Gesture Recognition, IEEE Computer Society, 1995, pp. 179-183.
[2] Z. Jun, Z. Fangwen, W. Jiaqi, Y. Zhengpeng, and C. Jinbo, "3D Hand-Gesture Analysis Based on Multi-Criterion in Multi-Camera Systems", in ICAL 2008: IEEE Int. Conf. on Automation and Logistics, IEEE Computer Society, September 2008, pp. 2342-2346.
[3] P. Mistry and P. Maes, "SixthSense: A Wearable Gestural Interface", in ACM SIGGRAPH ASIA 2009 Sketches, New York, NY, USA: ACM, 2009.
[4] A. Utsumi, T. Miyasato, and F. Kishino, "Multi-Camera Hand Pose Recognition System Using Skeleton Image", in RO-MAN'95: Proc. 4th IEEE Int. Workshop on Robot and Human Communication, IEEE Computer Society, July 1995, pp. 219-224.
[5] A. Wilson and N. Oliver, "GWindows: Robust Stereo Vision for Gesture Based Control of Windows", in ICMI'03: Proc. 5th Int. Conf. on Multimodal Interfaces, New York, NY, USA: ACM, 2003, pp. 211-218.
[6] C. Gadea, B. Ionescu, D. Ionescu, S. Islam, and B. Solomon, "Finger-Based Gesture Control of a Collaborative Online Workspace", in 7th IEEE Int. Symposium on Applied Computational Intelligence and Informatics, Timisoara, Romania, May 24-26, 2012.
[7] M. Ganasekera, "Computer Vision Based Hand Movement Capturing System", in Proc. 8th Int. Conf. on Computer Science & Education (ICCSE 2013), Colombo, Sri Lanka, April 26-28, 2013.
[8] F. Lamberti, "Endowing Existing Desktop Applications with Customizable Body Gesture-based Interfaces", in IEEE Int. Conf. on Consumer Electronics (ICCE), 2013.
[9] A. Agrawal, R. Raj, and S. Porwal, "Vision-based Multimodal Human-Computer Interaction using Hand and Head Gestures", in Proc. 2013 IEEE Conf. on Information and Communication Technologies (ICT 2013).
[10] M. Turk and G. Robertson, "Perceptual User Interfaces", Communications of the ACM, vol. 43, no. 3, March 2000.
[11] Y. Wu and T. S. Huang, "Vision-Based Gesture Recognition: A Review", Lecture Notes in Computer Science, vol. 1739, pp. 103-115, 1999.
 
Real Time Head & Hand Tracking Using 2.5D Data
Real Time Head & Hand Tracking Using 2.5D Data Real Time Head & Hand Tracking Using 2.5D Data
Real Time Head & Hand Tracking Using 2.5D Data
 
VIRTUAL MOUSE USING OPENCV
VIRTUAL MOUSE USING OPENCVVIRTUAL MOUSE USING OPENCV
VIRTUAL MOUSE USING OPENCV
 
Research on Detecting Hand Gesture
Research on Detecting Hand GestureResearch on Detecting Hand Gesture
Research on Detecting Hand Gesture
 
Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
 
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
 
Controlling Mouse Movements Using hand Gesture And X box 360
Controlling Mouse Movements Using hand Gesture And X box 360Controlling Mouse Movements Using hand Gesture And X box 360
Controlling Mouse Movements Using hand Gesture And X box 360
 
HAND GESTURE RECOGNITION.ppt (1).pptx
HAND GESTURE RECOGNITION.ppt (1).pptxHAND GESTURE RECOGNITION.ppt (1).pptx
HAND GESTURE RECOGNITION.ppt (1).pptx
 

Recently uploaded

Electronically Controlled suspensions system .pdf
Electronically Controlled suspensions system .pdfElectronically Controlled suspensions system .pdf
Electronically Controlled suspensions system .pdfme23b1001
 
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort serviceGurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort servicejennyeacort
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidNikhilNagaraju
 
Current Transformer Drawing and GTP for MSETCL
Current Transformer Drawing and GTP for MSETCLCurrent Transformer Drawing and GTP for MSETCL
Current Transformer Drawing and GTP for MSETCLDeelipZope
 
Churning of Butter, Factors affecting .
Churning of Butter, Factors affecting  .Churning of Butter, Factors affecting  .
Churning of Butter, Factors affecting .Satyam Kumar
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AIabhishek36461
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )Tsuyoshi Horigome
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETEINFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETEroselinkalist12
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxwendy cai
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfAsst.prof M.Gokilavani
 
Concrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxConcrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxKartikeyaDwivedi3
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girlsssuser7cb4ff
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 

Recently uploaded (20)

Electronically Controlled suspensions system .pdf
Electronically Controlled suspensions system .pdfElectronically Controlled suspensions system .pdf
Electronically Controlled suspensions system .pdf
 
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort serviceGurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfid
 
Current Transformer Drawing and GTP for MSETCL
Current Transformer Drawing and GTP for MSETCLCurrent Transformer Drawing and GTP for MSETCL
Current Transformer Drawing and GTP for MSETCL
 
Churning of Butter, Factors affecting .
Churning of Butter, Factors affecting  .Churning of Butter, Factors affecting  .
Churning of Butter, Factors affecting .
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AI
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )
 
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
 
POWER SYSTEMS-1 Complete notes examples
POWER SYSTEMS-1 Complete notes  examplesPOWER SYSTEMS-1 Complete notes  examples
POWER SYSTEMS-1 Complete notes examples
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETEINFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptx
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
 
Concrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxConcrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptx
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girls
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 

Computer Vision Based Human-Computer Interaction Using Color Detection

Computer Vision Based Human-Computer Interaction Using Color Detection Techniques

Chetan Dhule, Computer Science & Engineering Department, G.H. Raisoni College of Engineering, Nagpur, India, chetandhule123@gmail.com
Trupti Nagrare, Computer Science & Engineering Department, G.H. Raisoni College of Engineering, Nagpur, India, trupti.nagrare@raisoni.net

Abstract— Gesture-based human-computer interaction lets people control Windows applications by moving their hands through the air, making computers and devices easier to use. Existing solutions rely on gesture recognition algorithms that require specialized hardware and elaborate setups largely confined to the research lab. The algorithms used so far are neither practical nor responsive enough for real-world use, partly because the image processing operates on inadequate data. Because existing methods depend on gesture recognition, they need ANN training, which slows the whole process and reduces accuracy. The method we propose controls the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in pixel values of RGB colors in a video, without any ANN training, to obtain the exact sequence of hand and finger motion.

Keywords— computer vision, gesture recognition, speech recognition, human computer interaction

I. INTRODUCTION

Existing solutions rely on gesture recognition algorithms that require exotic hardware and elaborate setups largely confined to the research lab. Existing gesture recognition algorithms are not efficient or practical enough for real-world use, partly because the image processing is applied to inadequate data. Because these methods depend on gesture recognition, they require ANN training, which slows the whole process and reduces accuracy.
The main objective of our proposed method is to control the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in pixel values of RGB colors in a video, without any ANN training, to obtain the exact sequence of hand and finger motion.

II. PROBLEM DEFINITION

Unfortunately, most current gesture-recognition-based approaches have several shortcomings. Some require bulky hardware: users must wear multiple sensors and stand near multiple calibrated cameras while gestures are processed. Most of the cameras used for capturing images rely on color data, so they are sensitive to environmental factors such as dynamic backgrounds and lighting conditions. The algorithms that identify gestures from the data supplied by the cameras have proved unreliable when tested across many users. Because the time the computer needs to recognize a gesture is longer than the time needed to display its result, there is always a lag that slows the application. Moreover, all of these systems rely on a specific, pre-fixed set of gestures. Finally, there is no easy-to-use workspace or environment that allows users to freely use gestures to complete tasks such as controlling mouse motion and events.

III. OBJECTIVES

Existing solutions rely on gesture recognition algorithms that require exotic hardware, such as glove-mounted sensors worn on the hand to track mouse coordinates, and often require the user to stand near multiple calibrated cameras in elaborate setups confined to the research lab.
Existing gesture recognition algorithms are not practical or efficient enough for real-world use, partly because the image processing is applied to inadequate data, and their dependence on ANN training slows the whole process and reduces accuracy. The main objective of our proposed method is to control the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in pixel values of RGB colors in a video, without any ANN training, to obtain the exact sequence of hand and finger motion.

IV. LITERATURE REVIEW

The existing literature briefly explains the processing of hand gestures. Earlier work by Freeman and Weissman [1] allowed the user to control a television set by using a video camera and computer-vision template-matching algorithms to detect the user's hand from across a room. In this approach, the user could show an open hand and an on-screen hand icon would appear, which could be used to adjust various graphical controls, such as a volume slider. To activate a slider, the user had to cover the control for a fixed amount of time. Users enjoyed this alternative to the physical remote control, and the on-screen hand feedback was effective in assisting them. However, users had to hold their hand up for long periods to activate the different controls, which was tiring; this kind of user fatigue is common enough in gesture-based interfaces to have its own name, "gorilla arm".

Other approaches use multiple cameras to detect and track hand motion by producing a 3D image [2][4]. Because these systems use multiple cameras, they require a careful installation process, as calibration parameters such as the distance between the cameras are critical to the triangulation algorithms. Since a large amount of video data must be processed in real time, these algorithms prove computationally expensive, and stereo matching typically fails on scenes with little or no texture. Ultimately, such systems cannot be used outside their special lab environments.

In [3], Pranav Mistry presented the SixthSense wearable gestural interface, which used a camera and projector worn on the user's chest to let the user zoom in on projected maps (among other activities) with two-handed gestures. For the camera to detect the user's hand, the user had to wear brightly colored markers on their index fingers and thumbs. The ordinary webcam worn by the user was also sensitive to environmental conditions such as bright sunlight or darkness, which made it difficult to recognize the color markers.

2014 Fourth International Conference on Communication Systems and Network Technologies, 978-1-4799-3070-8/14 $31.00 © 2014 IEEE, DOI 10.1109/CSNT.2014.192
Wilson and Oliver [5] aimed to create GWindows, a Minority Report-like environment. By pointing with their hand and using voice commands, the user could move an on-screen cursor on a Microsoft Windows desktop and trigger actions such as "close" and "scroll" on the underlying application windows. They concluded that users preferred interacting with hand gestures over voice commands, and that desktop workspaces designed for gesture interaction would be worthwhile.

Regarding online workspaces, several commercial and academic web-based collaboration solutions have existed for some time. However, interaction with other users in these environments is usually limited to basic sharing of media files, rather than full real-time collaboration on entire web-based applications and their data between users on distinctly deployed domains, as this paper proposes. Cristian Gadea and Bogdan Ionescu [6] aimed to create finger-based gesture control of a collaborative online workspace, but their system needs continuous internet connectivity, which is not always available in India. It requires an online workspace called UC-IC, an application running within the web browser to determine the latest hand gesture, but high-speed connectivity cannot be guaranteed everywhere at all times. Besides this, it requires training to recognize gestures, which slows the system down.

The methods in [7][8][9] are based on gesture recognition algorithms that need ANN training, which slows the whole process and reduces accuracy: every attempt to recognize a gesture requires the ANN, which takes considerable time, so the system cannot match its output speed to the exact motion of the mouse pointer.

V. SYSTEM ARCHITECTURE

The system uses different preprocessing techniques and feature extraction to recognize the pixel values and coordinates of the RGB colors, tracking the change in pixel position of the colored stickers attached to the user's fingers in real time. The updated values are then sent to the PC to drive the motion of the mouse.

Figure 1: Block diagram of the different phases of the system.

A. Video Capturing: A continuous video stream is given as input to the system.

B. Image Processing: Image segmentation is done in two phases:
1. Skin detection model: detects the hand and fingers in the image.
2. Approximate median model: subtracts the background.
Using both methods together was observed to give much better segmentation for further processing.

C. Pixel Extraction: This phase extracts the pixel sequence from the image, without any ANN training, to obtain the exact sequence of hand and finger motion.

D. Color Detection: This phase extracts the positions of the RGB colors from the pixel sequence and detects the motion of the hand and fingers by calculating the change in pixel values of the RGB colors.
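Section V names two segmentation steps, a skin detection model and approximate-median background subtraction, without giving code. The following pure-Python sketch is our own illustration of both ideas, not the authors' implementation; the thresholds and the rule-based skin test (a classic RGB heuristic) are assumptions:

```python
def approximate_median_update(background, frame):
    """Approximate-median background model: nudge each background
    pixel one intensity step toward the current frame pixel."""
    out = []
    for bg_row, fr_row in zip(background, frame):
        row = []
        for bg, fr in zip(bg_row, fr_row):
            if fr > bg:
                bg += 1
            elif fr < bg:
                bg -= 1
            row.append(bg)
        out.append(row)
    return out

def foreground_mask(background, frame, threshold=25):
    """Pixels that differ strongly from the background model are
    treated as foreground (the hand). Threshold is an assumption."""
    return [[abs(fr - bg) > threshold
             for bg, fr in zip(bg_row, fr_row)]
            for bg_row, fr_row in zip(background, frame)]

def is_skin(r, g, b):
    """Rule-based RGB skin test (standard heuristic, assumed here)."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)
```

The approximate-median model is cheap because each background pixel moves only one intensity step per frame, so it slowly absorbs gradual lighting changes while a fast-moving hand remains in the foreground mask.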
E. Controlling the Mouse Pointer: Signals are sent to the system to control mouse pointer motion and mouse events, giving the PC the appropriate command to move the mouse pointer according to the motion of the user's fingers or hand.

VI. TECHNIQUES FOR PIXEL AND COLOR DETECTION

A. Video Capturing

1) Loading drivers: A system may have multiple web cameras, and each camera driver has a unique ID. The capGetDriverDescription function returns the name and ID of a driver.

2) Capturing: To capture the camera view: obj = capCreateCaptureWindow(). To start showing the camera view in a picture box in our software: sendMessage(connect, obj).

B. Processing Frames of the Video

The video cannot be processed directly, so it must be converted into images frame by frame: picture = hdcToPicture(obj). For example, with a 16 MP camera at 45 fps (frames per second), 45 images must be processed every second. To obtain the detailed RGB (red, green, blue) value of each pixel, the getBitmapBits() function is used.

C. Getting Pixel Color

Figure 2. Getting pixel color.

D. Scanning

Figure 3. Scanning pixel-wise horizontally in the x direction and returning vertically in the y direction.

E. Algorithm for Pixel and Color Detection

Figure 4: Algorithm for pixel and color detection. X: x-coordinate of the pixel in the image; Y: y-coordinate of the pixel in the image; R: red; G: green; B: blue.

VII. METHODOLOGY

A. Hand Position Tracking and Mouse Control

Figure 5. Hand position tracking and mouse control.

Capturing user input virtually is the main aim of this module: the user moves a finger in front of the camera's capture area. The camera captures and detects this motion, and the system processes it frame by frame. The system then computes the finger coordinates and, once they are calculated, updates the cursor position.

B. Laser Pointer Detection

Figure 6. Laser pointer detection.
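The scanning and color-detection steps of Section VI (Figs. 3 and 4) can be sketched in a few lines. This is a minimal pure-Python illustration under our own assumptions (a frame represented as rows of RGB tuples, a fixed per-channel tolerance, and a hypothetical 1920x1080 screen), not the paper's code:

```python
def find_marker(frame, target, tol=40):
    """Scan the frame row by row (as in Fig. 3) and collect the
    (x, y) coordinates of pixels within `tol` of the marker color
    on every channel."""
    tr, tg, tb = target
    hits = []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if abs(r - tr) <= tol and abs(g - tg) <= tol and abs(b - tb) <= tol:
                hits.append((x, y))
    return hits

def marker_centroid(hits):
    """Average position of all matching pixels; None if the marker
    is not visible in this frame."""
    if not hits:
        return None
    xs = [x for x, _ in hits]
    ys = [y for _, y in hits]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def to_screen(cx, cy, frame_w, frame_h, screen_w=1920, screen_h=1080):
    """Scale camera coordinates to screen coordinates for the cursor."""
    return (int(cx * screen_w / frame_w), int(cy * screen_h / frame_h))
```

Mapping camera to screen coordinates this way makes cursor motion proportional to marker motion; in practice some smoothing across frames (e.g. averaging the last few centroids) would likely be needed to avoid jitter.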
C. Hand-Gesture-Based Auto Image Grabbing (Virtual Zoom In/Out)

Figure 7. Virtual zoom in/out.

D. Camera Processing and Image Capturing

Figure 8. Camera processing and image capturing.

E. Virtual Sense for File Handling

This system uses virtual-sense technology to copy a file from one system to another within a local area network (LAN/Wi-Fi). The user makes a picking-up action over the file to be copied, moves it toward the target system, and releases it over that system.

VIII. RESULTS AND DISCUSSION

The software can control all mouse clicking events using a color marker. After several experiments, it was observed that a red color marker is more effective than markers of other colors.

Figure 9. Graphical user interface of the application.

Figure 10. Start camera.

Figure 11. Set the marker color.
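Section VII.C's virtual zoom in/out is driven by a hand gesture. One plausible way to implement it, sketched here as our own assumption rather than the paper's algorithm, is to track the distance between two color markers across frames and map its change to zoom commands, with a dead zone so small jitters are ignored:

```python
import math

def pinch_distance(p1, p2):
    """Euclidean distance between the two marker centroids."""
    return math.dist(p1, p2)

def zoom_command(prev_dist, curr_dist, dead_zone=5.0):
    """Spreading the markers apart zooms in; bringing them together
    zooms out; changes smaller than the dead zone do nothing.
    The dead-zone value is an assumption for illustration."""
    delta = curr_dist - prev_dist
    if delta > dead_zone:
        return "zoom_in"
    if delta < -dead_zone:
        return "zoom_out"
    return "none"
```

Comparing distances frame to frame rather than against a fixed reference means the gesture works regardless of how far the hands are from the camera.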
Figure 12. Control mouse motion and clicking events with the color marker set earlier.

IX. CONCLUSION

This project can be very useful for people who want to control a computer without actually touching the system, or without a wireless mouse, which always needs a surface to operate on. Accuracy is higher when a red color marker is used than with other color markers used individually. The problem that changing lighting conditions pose for color-based recognition has been addressed in this work by providing a button to set the marker color at the start of the application. Recognition speed remains a problem: the speed of controlling mouse motion is not yet 100% for some gestures and needs improvement. All mouse movements and key actions have been mapped and work well under the given circumstances. As future work, the application can be extended to mobile phones and game consoles, and other modes of human-computer interaction, such as voice recognition, facial expression, and eye gaze, can be combined to make the system more robust and flexible.

ACKNOWLEDGMENT

I want to thank all subjects who participated in our experiments, my guide for her valuable guidance, advice, and help during this project, and finally my parents for their encouragement.

REFERENCES

[1] W. T. Freeman and C. D. Weissman, "Television Control by Hand Gestures", in Proc. Int. Workshop on Automatic Face and Gesture Recognition, IEEE Computer Society, 1995, pp. 179-183.
[2] Z. Jun, Z. Fangwen, W. Jiaqi, Y. Zhengpeng, and C. Jinbo, "3D Hand-Gesture Analysis Based on Multi-Criterion in Multi-Camera Systems", in ICAL 2008: IEEE Int. Conf. on Automation and Logistics, IEEE Computer Society, September 2008, pp. 2342-2346.
[3] P. Mistry and P. Maes, "SixthSense: A Wearable Gestural Interface", in ACM SIGGRAPH ASIA 2009 Sketches, New York, NY, USA: ACM, 2009.
[4] A. Utsumi, T. Miyasato, and F. Kishino, "Multi-Camera Hand Pose Recognition System Using Skeleton Image", in RO-MAN'95: Proc. 4th IEEE Int. Workshop on Robot and Human Communication, IEEE Computer Society, July 1995, pp. 219-224.
[5] A. Wilson and N. Oliver, "GWindows: Robust Stereo Vision for Gesture-Based Control of Windows", in ICMI'03: Proc. 5th Int. Conf. on Multimodal Interfaces, New York, NY, USA: ACM, 2003, pp. 211-218.
[6] C. Gadea, B. Ionescu, D. Ionescu, S. Islam, and B. Solomon, "Finger-Based Gesture Control of a Collaborative Online Workspace", in 7th IEEE Int. Symposium on Applied Computational Intelligence and Informatics, May 24-26, 2012, Timisoara, Romania.
[7] M. Ganasekera, "Computer Vision Based Hand Movement Capturing System", in 8th Int. Conf. on Computer Science & Education (ICCSE 2013), April 26-28, 2013, Colombo, Sri Lanka.
[8] F. Lamberti, "Endowing Existing Desktop Applications with Customizable Body Gesture-Based Interfaces", in IEEE Int. Conf. on Consumer Electronics (ICCE), 978-1-4673-1363-6, 2013.
[9] A. Agrawal, R. Raj, and S. Porwal, "Vision-Based Multimodal Human-Computer Interaction Using Hand and Head Gestures", in Proc. 2013 IEEE Conf. on Information and Communication Technologies (ICT 2013).
[10] M. Turk and G. Robertson, "Perceptual User Interfaces", Communications of the ACM, vol. 43(3), March 2000.
[11] Y. Wu and T. S. Huang, "Vision-Based Gesture Recognition: A Review", Lecture Notes in Computer Science, vol. 1739, pp. 103-115, 1999.