International Journal of Advanced Information Technology (IJAIT) Vol. 2, No.3, June 2012
DOI: 10.5121/ijait.2012.2303
Gesture Based Interface Using Motion and Image Comparison
Shany Jophin1, Sheethal M.S2, Priya Philip3, T M Bhruguram4
Department of Computer Science, Adi Shankara Institute of Engineering and Technology, Kalady
shanyjophin.s@gmail.com
ABSTRACT
This paper presents a new approach for moving the mouse pointer and implementing its functions using a real-time camera. Here we propose to change the hardware design: most existing technologies depend on modifying the parts of the mouse, for example repositioning the tracking ball or adding more buttons, whereas we use a camera, a colored substance, image comparison and motion detection to control mouse movement and implement its functions (right click, left click, double click and scrolling).
KEYWORDS
HCI, Sixth Sense, VLCJ
1. INTRODUCTION
Human-computer interaction (HCI) is an important area of research in which people try to improve computer technology, and smaller and smaller devices are being used to do so. Vision-based gesture and object recognition is another such area. Simple interfaces such as embedded keyboards, folding keyboards and mini-keyboards already exist in today's market. However, these interfaces need a certain amount of space to use and cannot be used while moving. Touch screens are also widely used and provide a good control interface for many applications, but they cannot be applied to desktop systems because of cost and other hardware limitations. By applying vision technology, a colored substance and image comparison, and by controlling the mouse with natural hand gestures, we can reduce the work space required. In this paper, we propose a novel approach that uses a video device to control the mouse system.
2. RELATED WORK
2.1. Mouse free
Vision-Based Human-Computer Interaction through Real-Time Hand Tracking and Gesture Recognition: vision-based interaction is an appealing option for replacing primitive human-computer interaction (HCI) using a mouse or touchpad. The authors propose a system that uses a webcam to track a user's hand and recognize gestures to initiate specific interactions, and their contribution is a system for hand tracking and simple gesture recognition in real time [1].
Many researchers in the human-computer interaction and robotics fields have tried to control mouse movement using video devices, but they used different methods to generate a clicking event. One approach, by Erdem et al., used fingertip tracking to control the motion of the mouse; a click of the mouse button was implemented by defining a screen region such that a click occurred when the user's hand passed over it [2, 3]. Another approach was developed by Chu-Feng Lien [4], who used only the fingertips to control the mouse cursor and click. His clicking method was based on image density and required the user to hold the mouse cursor on the desired spot for a short period of time. Paul et al. used still another method to click: the motion of the thumb (from a 'thumbs-up' position to a fist) marked a clicking event, while movement of the hand making a special hand sign moved the mouse pointer.
2.2. A Method for Controlling Mouse Movement using a Real-Time Camera
This is a new approach for controlling mouse movement using a real-time camera. Most existing approaches involve changing mouse parts, such as adding more buttons or changing the position of the tracking ball; instead, the author proposes to change the hardware design. The method is to use a camera and computer vision technology, such as image segmentation and gesture recognition, to control mouse tasks (left and right clicking, double-clicking, and scrolling), and it is shown that the system can perform everything current mouse devices can. The paper describes how to build this mouse control system [5].
2.3. Sixth Sense
'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information. The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled in a pendant-like mobile wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision-based techniques [6].
2.4. Vlcj
The vlcj project is an open-source project that provides Java bindings for the vlc media player from VideoLAN. The bindings can be used to build media player client and server software in Java; everything from simply playing local media files to a full-blown video-on-demand streaming server is possible. vlcj is used in diverse applications, helping to provide video capabilities to software in use on oceanographic research vessels and in bespoke IPTV and home cinema solutions. vlcj is also being used to create software for an open-source video camera at Elphel and video mapping for the OpenStreetMap project [7].
2.5. Mouseless
Mouseless is an invisible computer mouse that provides the familiar interaction of a physical mouse without actually needing real mouse hardware. The invention removes the requirement of having a physical mouse altogether while still providing the intuitive interaction of a physical mouse that we are familiar with. Mouseless consists of an infrared (IR) laser beam (with a line cap) and an IR camera, both embedded in the computer. The laser beam module is modified with a line cap and placed such that it creates a plane of IR laser light just above the surface the computer sits on. The user cups their hand, as if a physical mouse were present underneath, and the laser beam lights up the part of the hand in contact with the surface. The IR camera detects these bright IR blobs using computer vision, and the change in their position and arrangement is interpreted as mouse cursor movement and mouse clicks. As the user moves their hand, the on-screen cursor moves accordingly; when the user taps their index finger, the size of the blob changes and the camera recognizes the intended mouse click [8].
2.6. Real-time Finger Tracking for Interaction
This work describes an approach to human finger motion and gesture detection using two cameras. The target of pointing on a flat monitor or screen is identified using image processing and line intersection, by processing images of the hand taken from above and from the side. The system is able to track finger movement without building a 3D model of the hand. The coordinates and movement of the finger in a live video feed can then be taken as the coordinates and movement of the mouse pointer for human-computer interaction purposes [9].
3. PROPOSED WORK
Figure 1. The proposed system
3.1. Product and system features
3.1.1. Computer
(Laptop) Processor: Pentium 4; processor speed: 1 GHz; RAM capacity: 512 MB; hard disk: 40 GB; monitor: 15-inch SVGA.
3.1.2. Webcam
It is used to capture the image and for image processing.
Video data format: 12.24-bit RGB; image resolution: max 2560x2084; software-enhanced menu display/sec: 30 in CIF mode; menu signal bit: 42 dB; lens: 6.00 mm; vision: +/-28; focus range: 3 cm to infinity.
3.1.3. Finger Tip
(Red and blue colored substance.) It is used as an alternative to the mouse and controls the functions of the pointer.
3.1.4. Software Requirements
Operating system: Windows XP, Windows Vista or Windows 7.
Code behind: Java (Red Hat or Eclipse).
3.1.5. Internal Interface Requirements
Swing, vlcj.
3.2. Tools used
Laptop (with software such as Red Hat, Eclipse and vlcj), a webcam and a colored device. The performance of the software is improved by examining one pixel and then skipping the next four consecutive pixels.
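As a rough illustration of this pixel-skipping optimization, the sketch below (in Java, the language used for the implementation) scans a captured frame with a stride of five pixels, collects the coordinates of sufficiently red pixels, and computes the midpoint of the detected region. The use of a standard java.awt.image.BufferedImage as the frame type and the numeric redness thresholds are assumptions for illustration, not values taken from the paper.

import java.awt.Point;
import java.awt.image.BufferedImage;
import java.util.ArrayList;
import java.util.List;

public class RedPixelScanner {

    // Examine one pixel, then skip the next four consecutive pixels (stride of 5).
    private static final int STRIDE = 5;

    /** Returns the coordinates of pixels judged to be "red" in the captured frame. */
    public static List<Point> findRedPixels(BufferedImage frame) {
        List<Point> redPixels = new ArrayList<Point>();
        for (int y = 0; y < frame.getHeight(); y += STRIDE) {
            for (int x = 0; x < frame.getWidth(); x += STRIDE) {
                int rgb = frame.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF;
                int g = (rgb >> 8) & 0xFF;
                int b = rgb & 0xFF;
                // Hypothetical redness test: red must clearly dominate green and blue.
                if (r > 150 && r - g > 100 && r - b > 100) {
                    redPixels.add(new Point(x, y));
                }
            }
        }
        return redPixels;
    }

    /** Midpoint of the detected red area, from the x and y coordinates of its pixels. */
    public static Point midpoint(List<Point> points) {
        if (points.isEmpty()) {
            return null; // no red object detected in this frame
        }
        long sumX = 0, sumY = 0;
        for (Point p : points) {
            sumX += p.x;
            sumY += p.y;
        }
        return new Point((int) (sumX / points.size()), (int) (sumY / points.size()));
    }
}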
Figure 2. The flow chart
3.3. Functional Requirements
The programmer has to follow a particular sequence of actions to implement the functions:
1. Use the functions of vlcj to capture the images given by the user.
2. Store the captured images in a temporary storage buffer.
3. Examine the pixels of the captured image (skipping four consecutive pixels at a time) to find the red color.
4. Find the midpoint by examining the x and y coordinates of the detected area; the pointer moves to the position of this midpoint.
5. Examine another pixel containing the red color and find its midpoint.
6. To move the pointer, compare the current pixel with the previous pixel; if the two pixels are the same, the pointer starts moving.
7. Perform mouse functions depending on the distance detected between the newly computed pixels.
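A minimal sketch of how the detected midpoint could drive the system pointer is given below. It uses the standard java.awt.Robot class for pointer movement and clicks; the linear camera-to-screen mapping and the distance thresholds that select a mouse function are illustrative assumptions, since the paper does not give its exact values.

import java.awt.AWTException;
import java.awt.Point;
import java.awt.Robot;
import java.awt.event.InputEvent;

public class PointerController {

    private final Robot robot;

    public PointerController() throws AWTException {
        this.robot = new Robot();
    }

    /** Maps a midpoint in camera coordinates to screen coordinates (assumed linear mapping). */
    public Point toScreen(Point camPoint, int camWidth, int camHeight,
                          int screenWidth, int screenHeight) {
        int sx = camPoint.x * screenWidth / camWidth;
        int sy = camPoint.y * screenHeight / camHeight;
        return new Point(sx, sy);
    }

    /** Moves the pointer to the position of the detected midpoint. */
    public void moveTo(Point screenPoint) {
        robot.mouseMove(screenPoint.x, screenPoint.y);
    }

    /**
     * Chooses a mouse function from the distance between two detected midpoints.
     * The thresholds (30 and 60 pixels) are hypothetical placeholders, not values from the paper.
     */
    public void performFunction(Point previous, Point current) {
        double distance = previous.distance(current);
        if (distance < 30) {
            robot.mousePress(InputEvent.BUTTON1_MASK);   // short separation: left click
            robot.mouseRelease(InputEvent.BUTTON1_MASK);
        } else if (distance < 60) {
            robot.mousePress(InputEvent.BUTTON3_MASK);   // medium separation: right click
            robot.mouseRelease(InputEvent.BUTTON3_MASK);
        } else {
            robot.mouseWheel(1);                         // large separation: scroll one notch
        }
    }
}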
The user also performs certain actions, in order: place a red colored finger cap or any other red colored substance in front of the webcam, then move the substance in front of the webcam to move the pointer, select icons on the screen and perform mouse functions.
Figure 3. Hand gesture
Figure 4. Hand gesture
Figure 5. Hand gesture
Figure 6. Hand gesture
4. RESULTS
From the implementation and execution of our program we found that the mouse pointer can be made to move and its functions can be implemented without the use of a touchpad or mouse. The pointer is moved by finger gestures, with a specific colored substance held in the hand (any colored cap or other small colored object), which makes the system easy to use. The performance of the software has been improved.
Figure 7. Screenshot of mouse pointer functions in Windows XP
The main problem encountered was that the finger shook a lot: each time the finger position changed, the illumination changed from frame to frame. To fix this problem we added code so that the cursor does not move when the difference between the previous and the current red pixel positions is within 4 pixels. The techniques used in this paper assume that all hand movement is properly coordinated.
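The shake fix described above can be expressed as a small dead-zone check. The sketch below shows one possible form of that guard, suppressing pointer updates while the newly detected red midpoint stays within four pixels of the previously accepted one; the class structure is an assumption, and only the 4-pixel threshold comes from our results.

import java.awt.Point;

public class JitterFilter {

    // Ignore movements smaller than this many pixels, as described above.
    private static final int DEAD_ZONE = 4;

    private Point lastAccepted;

    /**
     * Returns the point the cursor should follow: the new detection if it has moved
     * far enough, otherwise the previously accepted position.
     */
    public Point filter(Point detected) {
        if (lastAccepted == null
                || Math.abs(detected.x - lastAccepted.x) > DEAD_ZONE
                || Math.abs(detected.y - lastAccepted.y) > DEAD_ZONE) {
            lastAccepted = detected;
        }
        return lastAccepted;
    }
}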
Figure 8. Screenshot of mouse pointer functions in Linux
5. FUTURE WORK
There are still many improvements that can be made to our system, such as improving the performance of the current system and adding features like enlarging, shrinking and closing windows by using the palm and multiple fingers. The current system is sensitive to reflection and scale changes and requires proper hand gestures, good illumination and a powerful camera to perform the mouse functions. Precision can always be increased at the cost of recall by adding more stages, but each successive stage takes twice as much time to find harder negative samples. Many applications could benefit from this technology. We present an image viewing application as an example of where it could lead to a more natural user interface; the same could be said for navigating something like Google Maps or browsing folders on a screen. However, the applications reach far beyond that. They are particularly compelling in situations where touch screens are not applicable or are less than ideal. For example, with projection systems there is no screen to touch, so vision-based technology would provide an ideal replacement for touch screen technology. Similarly, in public terminals, constant use results in the spread of dirt and germs; vision-based systems would remove the need to touch such setups and would result in improved interaction.
5.1. Enlarge (zoom in) and shrink (zoom out)
More experiments were done to implement zooming and shrinking. Different hand gestures were analyzed to find the proper gesture for the zoom and shrink functions. "Moving the hands apart" was the first gesture analyzed, but it was later discarded because of its limitation of requiring two hands. The most convenient gesture selected was the expansion and contraction of the thumb and index finger. An appropriate scaling factor for enlarging and shrinking is chosen, and the scaling across the thumb and index finger is taken for further processing: the thumb and index finger movement is performed repeatedly to scale the distance and implement these functions.
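As a rough sketch of this idea, the code below derives a zoom factor from the change in separation between the detected thumb and index finger positions relative to the separation measured when the gesture starts. The reference-distance approach and the clamping limits are illustrative assumptions, since the paper does not specify the exact scaling factor used.

import java.awt.Point;

public class ZoomGestureScaler {

    // Clamp the zoom factor to a sensible range (assumed values, not from the paper).
    private static final double MIN_SCALE = 0.5;
    private static final double MAX_SCALE = 2.0;

    private final double referenceDistance;

    /** referenceDistance is the thumb-index separation measured when the gesture starts. */
    public ZoomGestureScaler(double referenceDistance) {
        this.referenceDistance = referenceDistance;
    }

    /** The ratio of the current finger separation to the initial one gives the zoom factor. */
    public double scaleFactor(Point thumb, Point indexFinger) {
        double current = thumb.distance(indexFinger);
        double scale = current / referenceDistance;
        return Math.max(MIN_SCALE, Math.min(MAX_SCALE, scale));
    }
}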
Figure 9. Diagrammatic representation of zoom and shrink
6. CONCLUSION
We developed a system to control the mouse cursor and implement its functions using a real-time camera. We implemented mouse movement, selection of icons, and functions such as right click, left click, double click and scrolling. The system is based on image comparison and motion detection technology to perform mouse pointer movement and icon selection. However, it is difficult to get stable results because of variations in lighting and because the same color may be detected elsewhere in the background; most of the algorithms used have illumination issues. From the results we can expect that if the algorithms can work in all environments, our system will work more efficiently. The system could be useful in presentations and for reducing work space. In the future, we plan to add more features such as enlarging, shrinking and closing windows by using the palm and multiple fingers. The performance of the software can only be improved by a small percentage because of the lack of a powerful camera and of a separate processor for this application.
REFERENCES
[1] Hojoon Park, "A Method for Controlling Mouse Movement using a Real-Time Camera," www.cs.brown.edu/research/pubs/theses/masters/2010/park.pdf, 2010.
[2] A. Erdem, E. Yardimci, Y. Atalay, V. Cetin, A. E., "Computer vision based mouse," Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2002.
[3] Vision based Man-Machine Interaction, http://www.ceng.metu.edu.tr/~vbi/.
[4] Chu-Feng Lien, "Portable Vision-Based HCI: A Real-time Hand Mouse System on Handheld Devices."
[5] Ben Askar, Chris Jordan, Hyokwon Lee, "Mouse free," http://www.seas.upenn.edu/cse400/CSE400-2009-2010/final-report/Jordan-lee.pdf, 2009-2010.
[6] Pranav Mistry, "SixthSense," http://www.youtube.com/watch?v=ZfV4R4x2SK0, 2010. [Online; accessed 28 November 2010].
[7] Mark Lee, "Java bindings for the vlc media player," http://www.capricasoftware.co.uk/vlcj/index.php, 2011. [Online; accessed 1 May 2011].
[8] Pranav Mistry, "Mouseless," http://www.pranavmistry.com/projects/mouseless/, 2010. [Online; accessed 28 November 2010].
[9] N. Shaker, M. Abou Zliekha (Damascus University), "Real-time Finger Tracking for Interaction," http://ieeexplore.ieee.org/search/freesrchabstract.jsp, 2007.
Authors
Shany Jophin received the B.Tech degree from the Department of Computer Science and Engineering, Government Engineering College, Kannur University, Wayanad, in 2011. She is currently pursuing a Master's degree in the Department of Computer Science and Engineering, Adi Shankara Institute of Engineering and Technology, MG University, Kerala, India. Her research interests include cyber forensics, networking and information security.
Sheethal M.S received the B.Tech degree from the Department of Computer Science and Engineering, KMEA Engineering College, Edathala, in 2007. She is currently pursuing a Master's degree in the Department of Computer Science and Engineering, Adi Shankara Institute of Engineering and Technology, MG University, Kerala. Her research interests include fuzzy clustering and image processing.
Priya Philip received the BE degree from the Department of Computer Science and Engineering, Sun College of Engineering and Technology, Nagercoil, Anna University, in 2011. She is currently pursuing a Master's degree in the Department of Computer Science and Engineering, Adi Shankara Institute of Engineering and Technology, MG University, Kerala, India.
T M Bhraguram is currently working as an Assistant Professor in the Department of Information Technology, Adi Shankara Institute of Engineering and Technology, Kerala. He has 4 years of teaching experience in various engineering colleges and industry experience at the Hiranandani Group, Mumbai, India.