Real Time Implementation of Eye Tracking System Using Arduino Uno Based Hardware Interface
B.K. Venugopal¹, Dilson D'souza²
¹Associate Professor, ²Master of Engineering
Dept. of Electronics and Communication, University Visvesvaraya College of Engineering, Bangalore
Abstract:
Eye tracking systems play a significant role in many of today's applications, ranging from military applications to the automotive industry and the healthcare sector. In this paper, a novel system for eye tracking and estimation of its direction of movement is presented. The proposed system is implemented in real time using an Arduino Uno microcontroller and a ZigBee wireless device. Experimental results show successful eye tracking and movement estimation in a real-time scenario using the proposed hardware interface.
Keywords — Arduino-based hardware, eye tracking system, binarization, eye region classification, Hough transform, Viola-Jones algorithm, ZigBee wireless device.
1. INTRODUCTION
Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in psycholinguistics, in marketing, as an input device for human-computer interaction, and in product design. There are several techniques for measuring eye movement. The most popular variant uses video images from which the eye position is extracted. Other techniques use search coils or rely on the electrooculogram.
The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records their movement as the viewer looks at some kind of stimulus. Most modern eye trackers use the centre of the pupil and infrared/near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil centre and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure for the individual is usually required before using the eye tracker.
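To illustrate this pupil-centre/corneal-reflection approach, the sketch below maps the pupil-CR difference vector to screen coordinates through a simple linear calibration. This is only an illustrative assumption about how such a mapping can be done, not the calibration used by any particular tracker; the least-squares fit and the function names are choices made for this example.

```python
import numpy as np

def fit_linear_gaze_map(pupil_cr_vectors, screen_points):
    """Least-squares fit of screen = a0 + a1*vx + a2*vy for each screen axis.

    pupil_cr_vectors : (N, 2) pupil-centre minus corneal-reflection vectors,
                       recorded while the user fixates known calibration targets.
    screen_points    : (N, 2) corresponding target positions in pixels.
    """
    v = np.asarray(pupil_cr_vectors, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    design = np.hstack([np.ones((len(v), 1)), v])         # columns: 1, vx, vy
    coeffs, *_ = np.linalg.lstsq(design, s, rcond=None)   # shape (3, 2)
    return coeffs

def estimate_gaze(coeffs, pupil_cr_vector):
    """Map one pupil-CR vector to an estimated point of regard on the screen."""
    vx, vy = pupil_cr_vector
    return np.array([1.0, vx, vy]) @ coeffs
```

During calibration the user fixates a small grid of on-screen targets while the pupil-CR vectors are recorded; the fitted coefficients are then applied to every subsequent frame.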
Two general types of infrared/near-infrared (also known as active light) eye-tracking techniques are used: bright pupil and dark pupil. Their difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, the eye acts as a retroreflector as the light reflects off the retina, creating a bright pupil effect similar to red eye. If the illumination source is offset from the optical path, the pupil appears dark because the retroreflection from the retina is directed away from the camera. Bright pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentations, and greatly reduces interference caused by eyelashes and other occluding features.
It also allows tracking in lighting conditions ranging from total darkness to very bright light. However, bright pupil techniques are not robust for tracking outdoors, as extraneous IR sources interfere with monitoring.
Another, less widely used, technique is known as passive light. It uses visible light for illumination, which may cause some distraction to users. A further challenge with this method is that the contrast of the pupil is lower than with the active-light methods, so the centre of the iris is used for calculating the gaze vector instead. This calculation requires detecting the boundary between the iris and the white sclera (limbus tracking), which poses an additional challenge for vertical eye movements because of occlusion by the eyelids.
The remainder of this paper is organized as follows. This first section has given a brief introduction to the area of interest and the general problems associated with it. A review of the literature and related works is given in Section 2. The proposed system and its implementation, covering the architecture and workflow of the project, are described in Sections 3 and 4 respectively. The obtained results are given in Section 5, and finally the conclusion of the overall work is presented along with the references.
2. LITERATURE SURVEY AND RELATED WORKS
Applications that make extensive use of eye tracking include the automotive industry, medical research, fatigue simulation, vehicle simulators, cognitive studies, computer vision, and activity recognition. The significance of eye detection and tracking in commercial applications has increased over time, and this has driven more efficient and robust designs, which are necessary in many of today's modern devices.
An extensive review of the literature has been carried out on eye tracking systems, with an emphasis on healthcare applications. Some of these methods are summarized below.
Z. Sharafi et al. [1] performed an evaluation of the metrics used in eye-tracking studies in software engineering. The authors bring the different eye-tracking metrics into one common standard to help researchers who face difficulty in analyzing eye-tracking experiments. The work also provides definitions of the different metrics used, along with suggestions for adopting metrics from other related fields.
S. Chandra et al. [2] surveyed applications based on eye tracking and their interaction with human beings. The work mainly focuses on determining the direction of gaze along with its position and sequence of movement, and addresses three objectives. The first objective was to give the user insight into the underlying issues in eye-movement technology. The second was to provide guidance concerning the development of eye-movement technology. The third was to identify the challenges and prospects in building Man and Machine Interfacing (MAMI) systems using the principle of eye tracking. Experimental results show a reduced computational time for gaze input as compared to mouse input.
Y. Zhang et al. [3] performed a comparative analysis of stationary and mobile eye tracking for a wayfinding system considering EXIT design. A mobile eye-tracking system was used to detect eye movement in the building where the EXIT design was installed, and a stationary eye-tracking technique was used for the same purpose. The comparative analysis produced conclusions primarily concerning the appearance, design and placement of the EXIT signs in the building, and introduced empirical methodologies to research on wayfinding and spatial decision-making.
D. Miyamoto et al. [4] proposed a method for enforcing phishing-prevention habits using eye tracking. An experiment was performed in which the eye movements of 23 participants were analyzed, and it was observed that novice participants did not tend to possess such habits. A prototype named EyeBit was developed, which required the participant to look at the address bar before entering information into web forms; the system deactivates and reactivates the form fields (as a control parameter) based on the participant's eye-movement pattern. Experimental results show the effectiveness of the proposal, which was based on prediction of the participants' eye movements, even though some inconvenience was observed with the EyeBit system.
R. G. Bozomitu et al. [5] proposed an eye tracking system based on the circular Hough transform algorithm, aimed at benefiting neuromotor-disabled patients. The signals were captured using an infrared video camera connected to a PC, and the eye-tracking process implemented in this work uses a keyword technology. Experimental results show that the optimal performance of pupil-movement detection rests on a trade-off between the algorithm's computational time and the precision of detection of the pupil region.
R. G. Bozomitu et al. [6] performed a comparative analysis of pupil-detection algorithms for eye-tracking applications. The experiment compared two eye-movement detection algorithms, the circular Hough transform and the Starburst algorithm: a parameter-based approach was implemented with the circular Hough transform and a feature-based approach with the Starburst algorithm. To improve cursor stability, a Gaussian smoothing filter together with an appropriate spike-removal technique was implemented. Experimental results showed that the Starburst-based approach had higher accuracy than the circular Hough-based approach with respect to cursor movement, but, owing to the notably high variation of the pupil-centre position in successive frames, the Starburst-based approach had higher levels of noise.
O. Mazhar et al. [7] proposed an eyeball tracking system operating in a real-time environment, aimed at people with neuromotor disabilities. A human-computer interface was developed that allows disabled patients to exercise control through their eye movements alone. Serial communication was used to send data from the webcam to a MATLAB-based simulation tool, where a segmentation process followed by a centroid calculation for the pupil produces a control signal from the patient. The experiment resulted in a real-time simulation of an eye tracking system.
M. Kim et al. [8] proposed a system for eye-movement detection and tracking based on cardinal directions. The experiment was conducted such that the participants were asked to look towards north for a given cardinal direction, and the respective reaction time for each cardinal direction was observed. The experimental results demonstrate that the participants' gaze fixed on the character part, the corners of shapes and the crossover points, and that the gaze movement was small in range for the cardinal directions. A significant observation in this experiment was that the clarity of the cardinal direction depended on the presence or absence of the character information.
C. Jin et al. [9] proposed a gaze-point compensation technique for eye tracking systems. The work proposes a length-compensation algorithm built upon the gaze-point algorithm. Experimental results showed a deviation of less than 1 cm in both the vertical and horizontal directions.
Z. Zhang et al. [10] proposed a classification method to separate novice from experienced drivers based on gaze behaviour, which reflects the drivers' attention and skill. The classifier used in this experiment is a binary Gaussian process classifier operating on two-dimensional data obtained from the participants' gaze behaviour. A comparative analysis between the Gaussian process classifier and Gaussian mixture models was performed, and the experimental results show that the proposed Gaussian process classifier performed better than the Gaussian mixture models.
F. B. Taher et al. [11] proposed a controller for an electric powered wheelchair (EPW) for certain types of disabled persons, based on a combination of EEG (electroencephalogram) signals and an eye tracking system. The proposed technique illustrates the significance of using a multi-source control process for the EPW. The first part involves the development of separate control techniques based on EEG and on eye tracking; the second part combines the two techniques using a data-fusion algorithm. Experimental results show that the EEG-based control signal alone, without the eye tracking system, gave higher performance, but the EEG device had the disadvantage of limited energy; implementing the proposed combined system resolved this limitation.
K. Kurzhals et al. [12] discussed eye tracking in computer-based visualization. The work deals with the development of a three-dimensional fly-through animation in which the technique portrayed an accurate representation of the distribution of galaxies that was visually and aesthetically pleasing to the observer.
F. Zhou et al. [13] proposed an eye tracking system based on the particle filtering algorithm. First, an AdaBoost-based classifier is applied to obtain the eye-region information; the eye region is then tracked by comparing two consecutive frames of the video sequence, for which the particle filtering algorithm is implemented. Experimental results show an improvement in the computational time between consecutive frames owing to the reduced search region for the human eye.
K. Kurzhals et al. [14] proposed an eye tracking approach primarily for applications in Personal Visual Analytics (PVA). This work explains the challenges that arise in real-time scenarios in the context of PVA, along with prospective directions for research in this application. The authors also present a technique for representing areas of interest across multiple videos.
3. PROPOSED SYSTEM
The purpose of the proposed system is to detect and track the pupil region of the eye. Applications of such a system can be found in areas ranging from military applications to the healthcare sector. The proposed system consists of three general modules. In the first module, video is captured in real time and eye detection is performed using the Viola-Jones algorithm. In the second module, binarization is performed and the Hough transform is applied to detect the pupil region of the eye. In the third module, the movement of the pupil region of the eye is detected and tracked and a direction is assigned accordingly. The obtained direction is then sent to a hardware experimental setup consisting of an Arduino Uno microcontroller with a ZigBee device for wireless transmission of the data.
The general architecture of the proposed system is given in Fig. 1. The purpose of the proposed system is the detection, identification and tracking of the pupil region of the eye in a real-time scenario. The video is captured from the image-capturing device and frames are extracted; for each frame, the three modules defined above are executed. The resulting direction of movement of the pupil region of the eye is sent to the hardware experimental setup as a control signal. The functionality of the proposed eye tracking system is divided into three modules: image pre-processing, pupil region detection, and pupil movement detection.
4. IMPLEMENTATION
The procedure for implementing the eye tracking system is as follows.
In the first module, image pre-processing, the video is captured from an image-capturing device and the respective frame is extracted; the image, which is in RGB format, is converted to a single-channel grayscale format. Gamma correction is then applied to bring the image to a suitable brightness and contrast; it is computed by controlling a parameter called the gamma factor. The eye region is detected using the Viola-Jones algorithm: Haar-like features are extracted, an integral image is created, the corresponding training is performed using the AdaBoost training method, and finally cascaded classifiers are used to detect the eye region in the face. A threshold is then set such that a pixel value becomes 0 if it is below the threshold and 1 if it is above the threshold. The user adjusts this setting until the pupil region is clearly identified and defined in the eye. These parameters are then passed to the binarization process.
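The paper does not include source code for this module; the following is a minimal Python/OpenCV sketch of the steps described above, under the assumption that OpenCV's bundled haarcascade_eye.xml detector is an acceptable stand-in for the Viola-Jones eye detector, and with the gamma factor and threshold treated as user-tuned parameters (the defaults shown are illustrative only).

```python
import cv2
import numpy as np

def preprocess_and_detect_eye(frame_bgr, gamma=1.5, threshold=60):
    """Module 1: grayscale conversion, gamma correction, Viola-Jones eye
    detection and thresholding of the detected eye region.

    gamma and threshold are user-tuned values (illustrative defaults here).
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Gamma correction through a lookup table: out = 255 * (in / 255) ** (1 / gamma).
    lut = ((np.linspace(0.0, 1.0, 256) ** (1.0 / gamma)) * 255).astype("uint8")
    corrected = cv2.LUT(gray, lut)

    # Viola-Jones detector (Haar-like features + AdaBoost + cascaded classifiers).
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = cascade.detectMultiScale(corrected, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None, None

    x, y, w, h = eyes[0]                     # first detected eye region
    eye_roi = corrected[y:y + h, x:x + w]

    # Binarization as described in the text: 0 below the threshold, 1 above it
    # (scaled to 255 for display); the user tunes `threshold` until the pupil
    # region stands out clearly.
    _, binary = cv2.threshold(eye_roi, threshold, 255, cv2.THRESH_BINARY)
    return eye_roi, binary
```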
Fig. 1: Proposed system
The second module defines the pupil region for the proposed system. Initially, a minimum and a maximum radius are defined, and any circle whose radius falls within this range is accepted as a candidate. The set parameters (i.e. the two radii) are then passed to the Hough transform, which detects and delineates the circular region in the eye (Fig. 3).
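A minimal sketch of this second module using OpenCV's HoughCircles (a Hough-gradient implementation of the circular Hough transform) is given below; the radius limits, blur kernel and accumulator parameters are illustrative assumptions rather than values taken from the paper.

```python
import cv2
import numpy as np

def detect_pupil(eye_gray, min_radius=10, max_radius=40):
    """Module 2: circular Hough transform over the (grayscale) eye region.

    Returns (cx, cy, r) of the strongest circle whose radius lies within
    [min_radius, max_radius], or None when no circle is found.
    """
    blurred = cv2.medianBlur(eye_gray, 5)   # suppress noise and eyelash edges
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1,
        minDist=eye_gray.shape[0] // 2,     # expect a single pupil per region
        param1=100, param2=20,
        minRadius=min_radius, maxRadius=max_radius)
    if circles is None:
        return None
    cx, cy, r = np.round(circles[0, 0]).astype(int)   # strongest candidate
    return cx, cy, r
```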
The third module detects the movement of the pupil region in the eye. A reference point is set for the pupil region, stored as a matrix whose rows and columns hold the radii and centres. The pixel position of the pupil region in the current frame is compared against that in the previous frame, and the conditions below set the direction of movement of the vehicle based on the pupil movement (Fig. 4); a minimal sketch of this decision logic follows the list.
Condition 1: if the row of the current frame is greater than the row of the previous frame, then move 'RIGHT'.
Condition 2: if the row of the current frame is less than the row of the previous frame, then move 'LEFT'.
Condition 3: if the column of the current frame is greater than the column of the previous frame, then move 'FORWARD'.
Condition 4: if the column of the current frame is less than the column of the previous frame, then 'STOP'.
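The sketch below implements the four conditions as stated, comparing the pupil centre in the current frame against the previous one; the small dead-band tol is an added assumption to keep pixel-level jitter from generating spurious commands.

```python
def pupil_direction(prev_pos, curr_pos, tol=2):
    """Module 3: map the change in pupil position between consecutive frames
    to a movement command, following Conditions 1-4 above.

    prev_pos, curr_pos : (row, col) pixel coordinates of the pupil centre.
    tol                : dead-band in pixels (an added assumption).
    """
    d_row = curr_pos[0] - prev_pos[0]
    d_col = curr_pos[1] - prev_pos[1]

    if d_row > tol:        # Condition 1
        return "RIGHT"
    if d_row < -tol:       # Condition 2
        return "LEFT"
    if d_col > tol:        # Condition 3
        return "FORWARD"
    if d_col < -tol:       # Condition 4
        return "STOP"
    return None            # no significant movement between frames
```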
The hardware experimental setup is shown in Fig. 5.
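The paper does not specify the serial protocol between the PC and the Arduino Uno; as one plausible arrangement (the port name, baud rate and single-character command encoding are all assumptions made for this sketch), the PC-side program could forward each direction decision over a serial link, with the Arduino relaying it through the ZigBee module.

```python
import serial  # pyserial

# Assumed one-byte command encoding, to be mirrored in the Arduino sketch.
COMMANDS = {"RIGHT": b"R", "LEFT": b"L", "FORWARD": b"F", "STOP": b"S"}

def open_link(port="/dev/ttyUSB0", baudrate=9600):
    """Open the serial link to the Arduino Uno (port name is an assumption)."""
    return serial.Serial(port, baudrate, timeout=1)

def send_direction(link, direction):
    """Forward one direction decision as a single command byte."""
    if direction in COMMANDS:
        link.write(COMMANDS[direction])

# Usage (port name differs per machine, e.g. "COM3" on Windows):
#   link = open_link("COM3")
#   send_direction(link, "FORWARD")
```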
Fig. 2: Proposed methodology for eye detection and binarization
Fig. 3: Proposed methodology for the Hough transformation
Fig. 4: Proposed methodology for eye movement detection
Fig. 5: Hardware experimental setup
5. SIMULATION RESULTS AND PARAMETRIC EVALUATIONS
The evaluation performed for the proposed eye tracking system is described as follows.
Database

Table 1: Database for the proposed eye tracking system

Sl. No.  Parameter               Description (value)
1        Video device            Integrated webcam
2        Video format            MJPG 1024x768
3        Returned colour space   RGB
4        Trigger                 Infinite
5        Frames/trigger          1
6        n                       Number of frames (126)
7        Image resolution        1024x768x3
8        Number of frames        126
Parametric analysis

Table 2: Parametric analysis for the proposed eye tracking system

Sl. No.  Parameter      Description
1        Video device   Webcam
2        n              Number of frames
3        g              Gamma factor
4        th             Threshold value for binarization
5        min            Minimum radius of the pupil
6        max            Maximum radius of the pupil
7        dir            Direction of pupil movement
The following evaluation metrics are used.

1. Mean Square Error (MSE): The MSE is defined as the average of the squared error, or deviation, between the obtained data and the reference data. It is mathematically defined as

   MSE = (1/n) Σᵢ (rᵢ − eᵢ)² .................... (1)

   where rᵢ → reference data and eᵢ → estimated data.

2. Peak Signal to Noise Ratio (PSNR): The PSNR is defined as the log of the ratio of the peak signal power to the noise power in the image. It is mathematically defined as

   PSNR = 10 log₁₀ (MAX² / MSE) .................... (2)

   where MAX → maximum possible pixel value in the image.

3. Sensitivity: Sensitivity is defined as the ratio of true positives to the sum of true positives and false negatives. It is mathematically defined as shown in Eq. (3),

   Sensitivity = TP / (TP + FN) .................... (3)

   where TP → true positive and FN → false negative.

4. Specificity: Specificity is defined as the ratio of true negatives to the sum of true negatives and false positives. It is mathematically represented as shown in Eq. (4),

   Specificity = TN / (TN + FP) .................... (4)

   where TN → true negative and FP → false positive.

Table 3: Experimental results

Parameter      Image
Elapsed time   33.82 s
PSNR           14.2 (avg.)
TP             65/126
TN             30/126
FP             20/126
FN             11/126
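As a quick check on these definitions, a small illustrative Python sketch (the variable names and the example call are assumptions, not code from the paper) computing the four metrics:

```python
import numpy as np

def mse(reference, estimate):
    """Eq. (1): mean of the squared deviation between reference and estimate."""
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    return float(np.mean((reference - estimate) ** 2))

def psnr(reference, estimate, max_value=255.0):
    """Eq. (2): 10*log10(MAX^2 / MSE), with MAX the peak pixel value."""
    err = mse(reference, estimate)
    return float("inf") if err == 0 else 10.0 * np.log10(max_value ** 2 / err)

def sensitivity(tp, fn):
    """Eq. (3): TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Eq. (4): TN / (TN + FP)."""
    return tn / (tn + fp)

# With the frame-level counts reported in Table 3 (out of 126 frames):
print(round(sensitivity(65, 11), 3))   # 0.855
print(round(specificity(30, 20), 3))   # 0.6
```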
6. CONCLUSION
In the experiment with the proposed eye tracking system, a successful interface between the computer and the hardware was achieved. Binarization and the Hough transform were used for the successful detection, identification and tracking of the pupil region of the eye. The hardware setup was built using an Arduino Uno microcontroller and a ZigBee wireless device. Experimental results show successful detection and tracking (with respect to direction) of the pupil region of the eye, with the resulting direction delivered over the serial connection to the hardware interface as a control signal.
REFERENCES
[1] Z. Sharafi, T. Shaffer, and B. Sharif, "Eye-tracking metrics in software engineering," in 2015 Asia-Pacific Software Engineering Conference (APSEC), pp. 96-103, IEEE, 2015.
[2] S. Chandra, G. Sharma, S. Malhotra, D. Jha, and A. P. Mittal, "Eye tracking based human computer interaction: Applications and their uses," in 2015 International Conference on Man and Machine Interfacing (MAMI), pp. 1-5, IEEE, December 2015.
[3] Y. Zhang, X. Zheng, W. Hong, and X. Mou, "A comparison study of stationary and mobile eye tracking on EXITs design in a wayfinding system," in 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), pp. 649-653, IEEE, December 2015.
[4] D. Miyamoto, T. Iimura, G. Blanc, H. Tazaki, and Y. Kadobayashi, "EyeBit: Eye-tracking approach for enforcing phishing prevention habits," in 2014 Third International Workshop on Building Analysis Datasets and Gathering Experience Returns for Security (BADGERS), pp. 56-65, IEEE, September 2014.
[5] R. G. Bozomitu, A. Păsărică, V. Cehan, R. G. Lupu, C. Rotariu, and E. Coca, "Implementation of eye-tracking system based on circular Hough transform algorithm," in E-Health and Bioengineering Conference (EHB), 2015, pp. 1-4, IEEE, November 2015.
[6] A. Păsărică, R. G. Bozomitu, V. Cehan, R. G. Lupu, and C. Rotariu, "Pupil detection algorithms for eye tracking applications," in 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME), pp. 161-164, IEEE, October 2015.
[7] O. Mazhar, T. A. Shah, M. A. Khan, and S. Tehami, "A real-time webcam based eye ball tracking system using MATLAB," in 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME), pp. 139-142, IEEE, October 2015.
[8] M. Kim, K. Morimoto, and N. Kuwahara, "Using eye tracking to investigate understandability of cardinal direction," in 2015 3rd International Conference on Applied Computing and Information Technology / 2nd International Conference on Computational Science and Intelligence (ACIT-CSI), pp. 201-206, IEEE, July 2015.
[9] C. Jin and Y. Li, "Research of gaze point compensation method in eye tracking system," in 2015 7th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), vol. 2, pp. 12-15, IEEE, August 2015.
[10] Z. Zhang, T. Kubo, J. Watanabe, T. Shibata, K. Ikeda, T. Bando, K. Hitomi, and M. Egawa, "A classification method between novice and experienced drivers using eye tracking data and Gaussian process classifier," in 2015 54th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), pp. 1409-1412, IEEE, July 2015.
[11] F. B. Taher, N. B. Amor, and M. Jallouli, "A multimodal wheelchair control system based on EEG signals and eye tracking fusion," in 2015 International Symposium on Innovations in Intelligent SysTems and Applications (INISTA), pp. 1-8, IEEE, September 2015.
[12] K. Kurzhals, M. Burch, T. Pfeiffer, and D. Weiskopf, "Eye tracking in computer-based visualization," Computing in Science & Engineering, vol. 17, no. 5, pp. 64-71, 2015.
[13] F. Zhou, W. Chen, and H. Fang, "Robust eye tracking and location method based on particle filtering algorithm," in 2014 IEEE 3rd International Conference on Cloud Computing and Intelligence Systems, pp. 247-252, IEEE, November 2014.
[14] K. Kurzhals and D. Weiskopf, "Eye tracking for personal visual analytics," IEEE Computer Graphics and Applications, vol. 35, no. 4, pp. 64-72, 2015.

More Related Content

What's hot

Ijarcet vol-2-issue-4-1352-1356
Ijarcet vol-2-issue-4-1352-1356Ijarcet vol-2-issue-4-1352-1356
Ijarcet vol-2-issue-4-1352-1356Editor IJARCET
 
IRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile SmartphoneIRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile SmartphoneIRJET Journal
 
IRJET - Real Time Facial Analysis using Tensorflowand OpenCV
IRJET -  	  Real Time Facial Analysis using Tensorflowand OpenCVIRJET -  	  Real Time Facial Analysis using Tensorflowand OpenCV
IRJET - Real Time Facial Analysis using Tensorflowand OpenCVIRJET Journal
 
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...Editor IJCATR
 
IRJET- Survey of Iris Recognition Techniques
IRJET- Survey of Iris Recognition TechniquesIRJET- Survey of Iris Recognition Techniques
IRJET- Survey of Iris Recognition TechniquesIRJET Journal
 
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...Editor IJCATR
 
Real Time Blinking Detection Based on Gabor Filter
Real Time Blinking Detection Based on Gabor FilterReal Time Blinking Detection Based on Gabor Filter
Real Time Blinking Detection Based on Gabor FilterWaqas Tariq
 
Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...IAEME Publication
 
NEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTIONNEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTIONcscpconf
 
A Review on Face Detection under Occlusion by Facial Accessories
A Review on Face Detection under Occlusion by Facial AccessoriesA Review on Face Detection under Occlusion by Facial Accessories
A Review on Face Detection under Occlusion by Facial AccessoriesIRJET Journal
 
Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...
Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...
Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...Editor IJCATR
 
Review of face detection systems based artificial neural networks algorithms
Review of face detection systems based artificial neural networks algorithmsReview of face detection systems based artificial neural networks algorithms
Review of face detection systems based artificial neural networks algorithmsijma
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Ahmed Gad
 
INTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTION
INTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTIONINTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTION
INTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTIONijma
 
IRJET- Survey on Face Detection Methods
IRJET- Survey on Face Detection MethodsIRJET- Survey on Face Detection Methods
IRJET- Survey on Face Detection MethodsIRJET Journal
 
A comparison of multiple wavelet algorithms for iris recognition 2
A comparison of multiple wavelet algorithms for iris recognition 2A comparison of multiple wavelet algorithms for iris recognition 2
A comparison of multiple wavelet algorithms for iris recognition 2IAEME Publication
 
Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...
Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...
Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...ijtsrd
 

What's hot (17)

Ijarcet vol-2-issue-4-1352-1356
Ijarcet vol-2-issue-4-1352-1356Ijarcet vol-2-issue-4-1352-1356
Ijarcet vol-2-issue-4-1352-1356
 
IRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile SmartphoneIRJET-Unconstraint Eye Tracking on Mobile Smartphone
IRJET-Unconstraint Eye Tracking on Mobile Smartphone
 
IRJET - Real Time Facial Analysis using Tensorflowand OpenCV
IRJET -  	  Real Time Facial Analysis using Tensorflowand OpenCVIRJET -  	  Real Time Facial Analysis using Tensorflowand OpenCV
IRJET - Real Time Facial Analysis using Tensorflowand OpenCV
 
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
 
IRJET- Survey of Iris Recognition Techniques
IRJET- Survey of Iris Recognition TechniquesIRJET- Survey of Iris Recognition Techniques
IRJET- Survey of Iris Recognition Techniques
 
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
Evaluation of Iris Recognition System on Multiple Feature Extraction Algorith...
 
Real Time Blinking Detection Based on Gabor Filter
Real Time Blinking Detection Based on Gabor FilterReal Time Blinking Detection Based on Gabor Filter
Real Time Blinking Detection Based on Gabor Filter
 
Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...
 
NEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTIONNEURAL NETWORK APPROACH FOR EYE DETECTION
NEURAL NETWORK APPROACH FOR EYE DETECTION
 
A Review on Face Detection under Occlusion by Facial Accessories
A Review on Face Detection under Occlusion by Facial AccessoriesA Review on Face Detection under Occlusion by Facial Accessories
A Review on Face Detection under Occlusion by Facial Accessories
 
Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...
Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...
Comparative Study of Lip Extraction Feature with Eye Feature Extraction Algor...
 
Review of face detection systems based artificial neural networks algorithms
Review of face detection systems based artificial neural networks algorithmsReview of face detection systems based artificial neural networks algorithms
Review of face detection systems based artificial neural networks algorithms
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...
 
INTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTION
INTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTIONINTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTION
INTEGRATING HEAD POSE TO A 3D MULTITEXTURE APPROACH FOR GAZE DETECTION
 
IRJET- Survey on Face Detection Methods
IRJET- Survey on Face Detection MethodsIRJET- Survey on Face Detection Methods
IRJET- Survey on Face Detection Methods
 
A comparison of multiple wavelet algorithms for iris recognition 2
A comparison of multiple wavelet algorithms for iris recognition 2A comparison of multiple wavelet algorithms for iris recognition 2
A comparison of multiple wavelet algorithms for iris recognition 2
 
Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...
Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...
Cross Pose Facial Recognition Method for Tracking any Person's Location an Ap...
 

Similar to IJET-V2I6P21

Eye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop EnvironmentEye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop Environment1crore projects
 
An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...
An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...
An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...IRJET Journal
 
Survey Paper on Eye Gaze Tracking Methods and Techniques
Survey Paper on Eye Gaze Tracking Methods and TechniquesSurvey Paper on Eye Gaze Tracking Methods and Techniques
Survey Paper on Eye Gaze Tracking Methods and TechniquesIRJET Journal
 
Eyetracking
EyetrackingEyetracking
EyetrackingMD Shafe
 
WebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy testWebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy testSzymon Deja
 
Eye tracking – an innovative monitor
Eye tracking – an innovative monitorEye tracking – an innovative monitor
Eye tracking – an innovative monitorSakthi Sivaraman S
 
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLYAN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLYijcsit
 
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY AIRCC Publishing Corporation
 
Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...
Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...
Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...Kalle
 
Eye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIEye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIDr. Amarjeet Singh
 
An efficient system for real time fatigue detection
An efficient system for real time fatigue detectionAn efficient system for real time fatigue detection
An efficient system for real time fatigue detectionAlexander Decker
 
A Survey on Smart Devices for Object and Fall Detection
A Survey on Smart Devices for Object and Fall DetectionA Survey on Smart Devices for Object and Fall Detection
A Survey on Smart Devices for Object and Fall DetectionIRJET Journal
 
Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...
Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...
Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...ssuser50a5ec
 
Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...IAEME Publication
 
Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...IAEME Publication
 
A Literature Review on Iris Segmentation Techniques for Iris Recognition Systems
A Literature Review on Iris Segmentation Techniques for Iris Recognition SystemsA Literature Review on Iris Segmentation Techniques for Iris Recognition Systems
A Literature Review on Iris Segmentation Techniques for Iris Recognition SystemsIOSR Journals
 

Similar to IJET-V2I6P21 (20)

Eye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop EnvironmentEye Gaze Tracking With a Web Camera in a Desktop Environment
Eye Gaze Tracking With a Web Camera in a Desktop Environment
 
An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...
An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...
An Eye State Recognition System using Transfer Learning: Inception‑Based Deep...
 
V4 n2 139
V4 n2 139V4 n2 139
V4 n2 139
 
Survey Paper on Eye Gaze Tracking Methods and Techniques
Survey Paper on Eye Gaze Tracking Methods and TechniquesSurvey Paper on Eye Gaze Tracking Methods and Techniques
Survey Paper on Eye Gaze Tracking Methods and Techniques
 
Eyetracking
EyetrackingEyetracking
Eyetracking
 
F0932733
F0932733F0932733
F0932733
 
WebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy testWebCam EyeTracker Accurycy test
WebCam EyeTracker Accurycy test
 
Eye tracking – an innovative monitor
Eye tracking – an innovative monitorEye tracking – an innovative monitor
Eye tracking – an innovative monitor
 
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLYAN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
 
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
AN ALGORITHM FOR AUTOMATICALLY DETECTING DYSLEXIA ON THE FLY
 
Real Time Eye Gaze Tracking
Real Time Eye Gaze TrackingReal Time Eye Gaze Tracking
Real Time Eye Gaze Tracking
 
Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...
Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...
Faro An Interactive Interface For Remote Administration Of Clinical Tests Bas...
 
Eye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIEye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AI
 
An efficient system for real time fatigue detection
An efficient system for real time fatigue detectionAn efficient system for real time fatigue detection
An efficient system for real time fatigue detection
 
A Survey on Smart Devices for Object and Fall Detection
A Survey on Smart Devices for Object and Fall DetectionA Survey on Smart Devices for Object and Fall Detection
A Survey on Smart Devices for Object and Fall Detection
 
Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...
Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...
Analysis_of_Navigation_Assistants_for_Blind_and_Visually_Impaired_People_A_Sy...
 
Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...
 
Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...Eye tracking and detection by using fuzzy template matching and parameter bas...
Eye tracking and detection by using fuzzy template matching and parameter bas...
 
G01114650
G01114650G01114650
G01114650
 
A Literature Review on Iris Segmentation Techniques for Iris Recognition Systems
A Literature Review on Iris Segmentation Techniques for Iris Recognition SystemsA Literature Review on Iris Segmentation Techniques for Iris Recognition Systems
A Literature Review on Iris Segmentation Techniques for Iris Recognition Systems
 

More from IJET - International Journal of Engineering and Techniques

More from IJET - International Journal of Engineering and Techniques (20)

healthcare supervising system to monitor heart rate to diagonize and alert he...
healthcare supervising system to monitor heart rate to diagonize and alert he...healthcare supervising system to monitor heart rate to diagonize and alert he...
healthcare supervising system to monitor heart rate to diagonize and alert he...
 
verifiable and multi-keyword searchable attribute-based encryption scheme for...
verifiable and multi-keyword searchable attribute-based encryption scheme for...verifiable and multi-keyword searchable attribute-based encryption scheme for...
verifiable and multi-keyword searchable attribute-based encryption scheme for...
 
Ijet v5 i6p18
Ijet v5 i6p18Ijet v5 i6p18
Ijet v5 i6p18
 
Ijet v5 i6p17
Ijet v5 i6p17Ijet v5 i6p17
Ijet v5 i6p17
 
Ijet v5 i6p16
Ijet v5 i6p16Ijet v5 i6p16
Ijet v5 i6p16
 
Ijet v5 i6p15
Ijet v5 i6p15Ijet v5 i6p15
Ijet v5 i6p15
 
Ijet v5 i6p14
Ijet v5 i6p14Ijet v5 i6p14
Ijet v5 i6p14
 
Ijet v5 i6p13
Ijet v5 i6p13Ijet v5 i6p13
Ijet v5 i6p13
 
Ijet v5 i6p12
Ijet v5 i6p12Ijet v5 i6p12
Ijet v5 i6p12
 
Ijet v5 i6p11
Ijet v5 i6p11Ijet v5 i6p11
Ijet v5 i6p11
 
Ijet v5 i6p10
Ijet v5 i6p10Ijet v5 i6p10
Ijet v5 i6p10
 
Ijet v5 i6p2
Ijet v5 i6p2Ijet v5 i6p2
Ijet v5 i6p2
 
IJET-V3I2P24
IJET-V3I2P24IJET-V3I2P24
IJET-V3I2P24
 
IJET-V3I2P23
IJET-V3I2P23IJET-V3I2P23
IJET-V3I2P23
 
IJET-V3I2P22
IJET-V3I2P22IJET-V3I2P22
IJET-V3I2P22
 
IJET-V3I2P21
IJET-V3I2P21IJET-V3I2P21
IJET-V3I2P21
 
IJET-V3I2P20
IJET-V3I2P20IJET-V3I2P20
IJET-V3I2P20
 
IJET-V3I2P19
IJET-V3I2P19IJET-V3I2P19
IJET-V3I2P19
 
IJET-V3I2P18
IJET-V3I2P18IJET-V3I2P18
IJET-V3I2P18
 
IJET-V3I2P17
IJET-V3I2P17IJET-V3I2P17
IJET-V3I2P17
 

Recently uploaded

8086 Microprocessor Architecture: 16-bit microprocessor
8086 Microprocessor Architecture: 16-bit microprocessor8086 Microprocessor Architecture: 16-bit microprocessor
8086 Microprocessor Architecture: 16-bit microprocessorAshwiniTodkar4
 
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...Amil baba
 
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxS1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxSCMS School of Architecture
 
Online food ordering system project report.pdf
Online food ordering system project report.pdfOnline food ordering system project report.pdf
Online food ordering system project report.pdfKamal Acharya
 
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXssuser89054b
 
PE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and propertiesPE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and propertiessarkmank1
 
8th International Conference on Soft Computing, Mathematics and Control (SMC ...
8th International Conference on Soft Computing, Mathematics and Control (SMC ...8th International Conference on Soft Computing, Mathematics and Control (SMC ...
8th International Conference on Soft Computing, Mathematics and Control (SMC ...josephjonse
 
School management system project Report.pdf
School management system project Report.pdfSchool management system project Report.pdf
School management system project Report.pdfKamal Acharya
 
Computer Networks Basics of Network Devices
Computer Networks  Basics of Network DevicesComputer Networks  Basics of Network Devices
Computer Networks Basics of Network DevicesChandrakantDivate1
 
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptxHOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptxSCMS School of Architecture
 
Basic Electronics for diploma students as per technical education Kerala Syll...
Basic Electronics for diploma students as per technical education Kerala Syll...Basic Electronics for diploma students as per technical education Kerala Syll...
Basic Electronics for diploma students as per technical education Kerala Syll...ppkakm
 
fitting shop and tools used in fitting shop .ppt
fitting shop and tools used in fitting shop .pptfitting shop and tools used in fitting shop .ppt
fitting shop and tools used in fitting shop .pptAfnanAhmad53
 
Hostel management system project report..pdf
Hostel management system project report..pdfHostel management system project report..pdf
Hostel management system project report..pdfKamal Acharya
 
Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)Ramkumar k
 
Introduction to Serverless with AWS Lambda
Introduction to Serverless with AWS LambdaIntroduction to Serverless with AWS Lambda
Introduction to Serverless with AWS LambdaOmar Fathy
 
Path loss model, OKUMURA Model, Hata Model
Path loss model, OKUMURA Model, Hata ModelPath loss model, OKUMURA Model, Hata Model
Path loss model, OKUMURA Model, Hata ModelDrAjayKumarYadav4
 
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
COST-EFFETIVE  and Energy Efficient BUILDINGS ptxCOST-EFFETIVE  and Energy Efficient BUILDINGS ptx
COST-EFFETIVE and Energy Efficient BUILDINGS ptxJIT KUMAR GUPTA
 
Max. shear stress theory-Maximum Shear Stress Theory ​ Maximum Distortional ...
Max. shear stress theory-Maximum Shear Stress Theory ​  Maximum Distortional ...Max. shear stress theory-Maximum Shear Stress Theory ​  Maximum Distortional ...
Max. shear stress theory-Maximum Shear Stress Theory ​ Maximum Distortional ...ronahami
 

Recently uploaded (20)

Signal Processing and Linear System Analysis
Signal Processing and Linear System AnalysisSignal Processing and Linear System Analysis
Signal Processing and Linear System Analysis
 
8086 Microprocessor Architecture: 16-bit microprocessor
8086 Microprocessor Architecture: 16-bit microprocessor8086 Microprocessor Architecture: 16-bit microprocessor
8086 Microprocessor Architecture: 16-bit microprocessor
 
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
 
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxS1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
 
Online food ordering system project report.pdf
Online food ordering system project report.pdfOnline food ordering system project report.pdf
Online food ordering system project report.pdf
 
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
 
PE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and propertiesPE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and properties
 
8th International Conference on Soft Computing, Mathematics and Control (SMC ...
8th International Conference on Soft Computing, Mathematics and Control (SMC ...8th International Conference on Soft Computing, Mathematics and Control (SMC ...
8th International Conference on Soft Computing, Mathematics and Control (SMC ...
 
School management system project Report.pdf
School management system project Report.pdfSchool management system project Report.pdf
School management system project Report.pdf
 
Computer Networks Basics of Network Devices
Computer Networks  Basics of Network DevicesComputer Networks  Basics of Network Devices
Computer Networks Basics of Network Devices
 
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptxHOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
 
Basic Electronics for diploma students as per technical education Kerala Syll...
Basic Electronics for diploma students as per technical education Kerala Syll...Basic Electronics for diploma students as per technical education Kerala Syll...
Basic Electronics for diploma students as per technical education Kerala Syll...
 
fitting shop and tools used in fitting shop .ppt
fitting shop and tools used in fitting shop .pptfitting shop and tools used in fitting shop .ppt
fitting shop and tools used in fitting shop .ppt
 
Integrated Test Rig For HTFE-25 - Neometrix
Integrated Test Rig For HTFE-25 - NeometrixIntegrated Test Rig For HTFE-25 - Neometrix
Integrated Test Rig For HTFE-25 - Neometrix
 
Hostel management system project report..pdf
Hostel management system project report..pdfHostel management system project report..pdf
Hostel management system project report..pdf
 
Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)
 
Introduction to Serverless with AWS Lambda
Introduction to Serverless with AWS LambdaIntroduction to Serverless with AWS Lambda
Introduction to Serverless with AWS Lambda
 
Path loss model, OKUMURA Model, Hata Model
Path loss model, OKUMURA Model, Hata ModelPath loss model, OKUMURA Model, Hata Model
Path loss model, OKUMURA Model, Hata Model
 
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
COST-EFFETIVE  and Energy Efficient BUILDINGS ptxCOST-EFFETIVE  and Energy Efficient BUILDINGS ptx
COST-EFFETIVE and Energy Efficient BUILDINGS ptx
 
Max. shear stress theory-Maximum Shear Stress Theory ​ Maximum Distortional ...
Max. shear stress theory-Maximum Shear Stress Theory ​  Maximum Distortional ...Max. shear stress theory-Maximum Shear Stress Theory ​  Maximum Distortional ...
Max. shear stress theory-Maximum Shear Stress Theory ​ Maximum Distortional ...
 

IJET-V2I6P21

  • 1. International Journal of Engineering and Techniques - Volume 2 Issue 6, Nov – Dec 2016 ISSN: 2395-1303 http://www.ijetjournal.org Page 140 Real Time Implementation of Eye Tracking System Using Arduino Uno Based Hardware Interface B.K. Venugopal1 , Dilson D’souza2 1 Associate professor 2 Master of Engineering Dept. of Electronics and Communication, University Visvesvaraya College of Engineering, Bangalore 1. INTRODUCTION 1. Wireless LAN Medium Access Control (MAC) and Eye tracking is one of the significant way towards measuring either the purpose of look (where one is looking) or the movement of an eye with respect to the head. An eye tracker is a gadget for measuring eye positions and eye development. Eye trackers are utilized as a part of exploration on the visual framework, in brain research, in psycholinguistics, showcasing, as an information gadget for human- PC cooperation, and in item plan. There are various techniques for measuring eye development. The most mainstream variation utilizes video pictures from which the eye position is extricated. Different techniques use look curls or depend on the electrooculogram. The most broadly utilized current plans are video- based eye trackers. A camera concentrates on one or both eyes and records their development as the viewer takes a consideration at some sort of boost. Most present day eye-trackers utilize the focal point of the pupil and infrared/close infrared non- collimated light to make corneal reflections (CR). The vector between the understudy focus and the corneal reflections can be utilized to process the purpose of respect on surface or the look heading. A basic alignment technique of the individual is generally required before utilizing the eye tracker. Two general sorts of infrared/close infrared (otherwise called dynamic light) eye following procedures are utilized: enhanced pupil and dim pupil. Their distinction depends on the area of the light source as for the optics. On the off chance that the luminance is coaxial with the optical way, then the eye goes about as a retroreflector as the light reflects off the retina making a splendid student impact like red eye. In the event that the light source is balanced from the optical way, then the student seems dull on the grounds that the retroreflection from the retina is coordinated far from the camera. Splendid pupil following makes more noteworthy iris/pupil contrast, permitting more vigorous eye following with all iris pigmentation, and incredibly diminishes obstruction brought about by eyelashes and other darkening features. It likewise permits following in lighting conditions going from aggregate haziness to brilliant. Yet, splendid understudy methods are not powerful to RESEARCH ARTICLE OPEN ACCESS Abstract: Eye tracking system has played a significant role in many of today’s applications ranging from military applications to automotive industries and healthcare sectors. In this paper, a novel system for eye tracking and estimation of its direction of movement is performed. The proposed system is implemented in real time using an arduino uno microcontroller and a zigbee wireless device. Experimental results show a successful eye tracking and movement estimation in real time scenario using the proposed hardware interface. Keywords — Arduino based hardware, eye tracking system, Binarization, eye region classification, Hough transform, Viola-Jones algorithm and zigbee wireless device.
track outdoors, since extraneous IR sources interfere with monitoring. Another, less widely used, technique is passive light tracking. It uses visible light for illumination, which may cause some distraction to users. A further challenge with this method is that the contrast of the pupil is lower than in active light methods, so the centre of the iris is used for computing the gaze vector instead. This computation must locate the boundary between the iris and the white sclera (limbus tracking), which introduces an additional difficulty for vertical eye movements owing to occlusion by the eyelids.

The remainder of this paper is organized as follows. This first section has given a brief introduction to the area of interest and the general problems associated with it. A review of the literature and related works is given in Section 2. The proposed system and its implementation, covering the architecture and work flow of the project, are described in Sections 3 and 4 respectively. The obtained results are presented in Section 5, followed by the conclusion and the references.

2. LITERATURE SURVEY AND RELATED WORKS

Applications that make extensive use of eye tracking are found in the automotive industry, medical research, fatigue simulation, vehicle simulators, cognitive studies, computer vision, activity recognition and related fields. The importance of eye detection and tracking in commercial applications has grown over time, and this has driven the more efficient and robust designs required by many of today's devices. An extensive review of the literature on eye tracking systems, with an emphasis on healthcare applications, has been carried out; some of these methods are summarized below.

Z. Sharafi et al. [1] evaluated the metrics used for eye tracking in software engineering. The authors bring the different eye-tracking metrics into one common standard to help researchers who face difficulty in analyzing eye-tracking experiments. The work also provides definitions of the metrics used, together with suggestions for adopting related metrics from other fields.

S. Chandra et al. [2] proposed applications based on eye tracking and its interaction with human beings. The work focuses on determining the direction of gaze together with its position and sequence of movement, and addresses three aspects of the eye-tracking experiment. The first objective was to give the user an insight into the underlying issues in eye-movement technology; the second was to provide guidance on the development of that technology; the third was to identify the challenges and prospects in building Man And Machine Interfacing (MAMI) systems based on eye tracking. Experimental results show a reduced computational time for gaze input compared with mouse input.

Y. Zhang et al. [3] performed a comparative analysis of stationary and mobile eye-tracking wayfinding systems with respect to EXIT design. A mobile tracking system was used to record eye movement where the EXIT signage was installed in the building, and a stationary eye-tracking technique was used for the same purpose.
The comparative analysis led to conclusions drawn primarily from the appearance, design and placement of the EXIT signage in the building, and introduced empirical methodologies to research on wayfinding and spatial decision making.

D. Miyamoto et al. [4] proposed a method for enforcing phishing-prevention habits using eye tracking. The eye movements of 23 participants were analyzed, and it was observed that novice participants did not tend to possess such habits. A prototype named EyeBit was developed, which requires the participant to look at the address bar before entering data into web forms; the system deactivates and reactivates the forms (the control parameter) according to the participant's eye-movement pattern.
Experimental results showed the effectiveness of the proposal, which was based on predicting the participants' eye movements, even though some inconvenience in using the EyeBit system was reported.

R. G. Bozomitu et al. [5] proposed an eye tracking system based on the circular Hough transform algorithm, aimed at benefiting patients with neuromotor disabilities. The signals were captured using an infrared video camera connected to a PC, and the eye tracking system implemented in this work is used to operate a keyboard technology. Experimental results show that the optimal performance of pupil movement detection rests on a trade-off between the computational time of the algorithm and the precision with which the pupil region is detected.

R. G. Bozomitu et al. [6] performed a comparative analysis of pupil detection algorithms for eye tracking applications. The experiment compared two eye-movement detection algorithms, the circular Hough transform and the Starburst algorithm: a parameter-based approach was implemented for the circular Hough transform and a feature-based approach for Starburst. To improve cursor stability, a Gaussian smoothing filter together with an appropriate spike-removal technique was applied. Experimental results showed that the Starburst-based approach achieved higher accuracy than the circular Hough transform with respect to cursor movement, but the notably high disparity of the pupil centre position in successive frames made the Starburst-based approach noisier.

O. Mazhar et al. [7] proposed an eye-ball tracking system operating in real time, aimed at people with neuromotor disabilities. A human-computer interface was developed that allows disabled patients to issue commands using only their eye movements. Data from the webcam were sent over a serial link to a MATLAB-based simulation tool, where a segmentation process followed by a centroid calculation of the pupil produces the control signal from the patient. The experiment resulted in a real-time simulation of an eye tracking system.

M. Kim et al. [8] proposed a system for detecting and tracking eye movement based on cardinal directions. Participants were asked to look towards north in a given cardinal direction, and the reaction time for each cardinal direction was observed. The experimental results demonstrate that the participants' gaze fixes on the character part, the corners of shapes and the crossover points, and that the gaze movement associated with the cardinal direction was small in range. A significant observation was that the clarity of the cardinal direction confirmed the presence or absence of the character information.

C. Jin et al. [9] proposed a gaze-point compensation technique for eye tracking systems, introducing a length compensation algorithm built on the gaze-point algorithm. Experimental results showed a deviation of less than 1 cm in both the vertical and horizontal directions.
Z. Zhang et al. [10] proposed a classification method for separating novice from experienced drivers based on gaze behaviour, which reflects the driver's attention and skill. The classifier used in this experiment is a binary Gaussian process classifier applied to two-dimensional data obtained from the participants' gaze behaviour, and a comparative analysis was carried out between this classifier and Gaussian mixture models. Experimental results show that the proposed Gaussian process classifier performed better than the Gaussian mixture models.

F. B. Taher et al. [11] proposed a controller for an electric powered wheelchair (EPW) intended for certain types of disabled persons, based on combining EEG (electroencephalogram) signals with an eye tracking system. The proposed technique illustrates the value of using a multi-source control process for the EPW. The first part elaborates separate control techniques based on the EEG and on the eye tracking system; the second part combines the two techniques by means of a data fusion algorithm.
Experimental results show that the EEG-based control signal alone, without the eye tracking system, gave a higher performance, but the EEG devices have the disadvantage of limited energy; implementing the proposed combined system removes this limitation.

K. Kurzhals et al. [12] discussed eye tracking for computer-based visualization. The work deals mainly with the development of a three-dimensional fly-through animation system, which gave an accurate representation of the distribution of galaxies that was visually and aesthetically pleasing to the observer.

F. Zhou et al. [13] proposed an eye tracking method based on the particle filtering algorithm. First, an AdaBoost-based classifier is applied to obtain the eye region information; the eye region is then sought by comparing two consecutive frames of the video sequence, for which the particle filtering algorithm is used. Experimental results show an improvement in the computational time between consecutive frames owing to the reduced search region for the human eye.

K. Kurzhals et al. [14] proposed an eye tracking approach for applications in Personal Visual Analytics (PVA). The work explains the challenges that arise in real-time scenarios in the context of PVA, outlines potential research directions in this application area, and presents a technique for representing areas of interest across multiple videos.

3. PROPOSED SYSTEM

The purpose of the proposed system is to detect and track the pupil region of the eye; applications of such a system range from the military domain to the healthcare sector. The proposed system consists of three general modules. In the first module, video is captured in real time and eye detection is performed using the Viola-Jones algorithm. The second module performs binarization and applies the Hough transform to detect the pupil region of the eye. The third module detects and tracks the movement of the pupil region and assigns a direction to it. The obtained direction is then sent to a hardware setup consisting of an Arduino Uno microcontroller with a ZigBee device for wireless transmission of the data. The general architecture of the proposed system is given in Fig. 1.

The system therefore detects, identifies and tracks the pupil region of the eye in a real-time scenario. Video is captured from the image capturing device and frames are extracted; for each frame the three modules defined above are executed, and the resulting direction of movement of the pupil region is sent to the hardware setup as a control signal. The functionality of the proposed eye tracking system is divided into three modules: image pre-processing, pupil region detection, and pupil movement detection.

4. IMPLEMENTATION

The procedure for implementing the eye tracking system is as follows.
In the first module, image pre-processing, the video is captured from an image capturing device and the current frame is extracted. The frame, which is in RGB format, is converted to a single-channel grayscale image, and gamma correction is applied to bring the image to a suitable brightness and contrast; the correction is controlled by a parameter called the gamma factor. The eye region is then detected using the Viola-Jones algorithm: features are extracted with Haar wavelets, an integral image is created, the corresponding training is performed using the AdaBoost method, and cascaded classifiers are finally used to detect the eye region within the face. A threshold is then set such that pixels whose value lies below the threshold become 0 and pixels above it become 1. The user adjusts this threshold until the pupil region is clearly identified and defined in the eye, and these parameters are passed on to the binarization process, as sketched below.
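The paper itself contains no source code, so the following is only a minimal illustrative sketch of this pre-processing module, written here in Python with OpenCV under stated assumptions: the gamma and threshold arguments play the role of the g and th parameters of Table 2 (their default values are arbitrary), and OpenCV's stock haarcascade_eye.xml detector stands in for the trained Viola-Jones classifier used by the authors.

    import cv2
    import numpy as np

    def preprocess_frame(frame_bgr, gamma=1.5, threshold=60):
        # Module 1: grayscale conversion, gamma correction, Viola-Jones
        # eye detection and binarization of the detected eye region.
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

        # Gamma correction via a 256-entry lookup table controlled by the gamma factor
        lut = np.array([((i / 255.0) ** (1.0 / gamma)) * 255
                        for i in range(256)]).astype(np.uint8)
        corrected = cv2.LUT(gray, lut)

        # Viola-Jones (Haar cascade) eye detection; the stock OpenCV cascade
        # is used here instead of the authors' own trained classifier
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_eye.xml")
        eyes = cascade.detectMultiScale(corrected, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) == 0:
            return None, None

        # Crop the first detected eye and binarize it; 255 stands for logical 1,
        # and the user tunes 'threshold' until the pupil region is clearly defined
        x, y, w, h = eyes[0]
        eye = corrected[y:y + h, x:x + w]
        _, binary = cv2.threshold(eye, threshold, 255, cv2.THRESH_BINARY)
        return eye, binary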
Fig. 1: Proposed system

The second module defines the pupil region of the eye. A minimum and a maximum radius are first specified, so that only circular regions whose radii fall within this range are accepted as circles. These two radius parameters are then passed to the Hough transform, which detects and delineates the circular (pupil) region of the eye (Fig. 3).

The third module detects the movement of the pupil region. A reference point is assigned to the pupil region in the form of a matrix whose rows and columns hold the radii and centres. The pixel position of the pupil region in the current frame is compared with that in the previous frame, and the following conditions determine the direction of movement of the vehicle from the pupil movement (Fig. 4); a sketch of this decision logic is given after the figure captions below.

Condition 1: if the row index in the current frame is greater than that in the previous frame, move 'RIGHT'.
Condition 2: if the row index in the current frame is less than that in the previous frame, move 'LEFT'.
Condition 3: if the column index in the current frame is greater than that in the previous frame, move 'FORWARD'.
Condition 4: if the column index in the current frame is less than that in the previous frame, 'STOP'.

The hardware experimental setup is shown in Fig. 5.

Fig. 2: Proposed methodology for eye detection and binarization
Fig. 3: Proposed methodology for Hough transformation
Fig. 4: Proposed methodology for eye movement detection
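As an illustration of the second and third modules only (this is not the authors' code), the sketch below applies OpenCV's circular Hough transform to the grayscale eye image, derives the movement direction from Conditions 1-4 by comparing the pupil centre of the current frame with that of the previous frame, and writes the resulting direction string to a serial port assumed to be bridged to the Arduino Uno/ZigBee link. The port name, baud rate and Hough parameters are assumptions; min_r and max_r correspond to the minimum and maximum radii of Table 2.

    import cv2
    import numpy as np
    import serial  # pyserial; the assumed transport towards the Arduino/ZigBee link

    def detect_pupil(eye_gray, min_r=5, max_r=30):
        # Module 2: circular Hough transform restricted to radii in [min_r, max_r]
        circles = cv2.HoughCircles(eye_gray, cv2.HOUGH_GRADIENT, dp=1,
                                   minDist=max(eye_gray.shape[0] // 2, 1),
                                   param1=100, param2=20,
                                   minRadius=min_r, maxRadius=max_r)
        if circles is None:
            return None
        x, y, r = np.round(circles[0, 0]).astype(int)
        return (x, y, r)

    def decide_direction(prev_center, curr_center):
        # Module 3: Conditions 1-4, comparing previous and current pupil centres
        if prev_center is None or curr_center is None:
            return None
        prev_col, prev_row = prev_center[0], prev_center[1]
        curr_col, curr_row = curr_center[0], curr_center[1]
        if curr_row > prev_row:
            return "RIGHT"
        if curr_row < prev_row:
            return "LEFT"
        if curr_col > prev_col:
            return "FORWARD"
        if curr_col < prev_col:
            return "STOP"
        return None

    def send_direction(direction, port="/dev/ttyUSB0", baud=9600):
        # Send the direction as the control signal over the serial/ZigBee link
        # (port name and baud rate are assumptions chosen for illustration)
        if direction is None:
            return
        with serial.Serial(port, baud, timeout=1) as link:
            link.write((direction + "\n").encode("ascii"))

Chained with the pre-processing sketch given earlier, these functions form the per-frame loop of the proposed pipeline: pre-process the frame, locate the pupil, decide the direction, and transmit it as the control signal.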
Fig. 5: Hardware experimental setup

5. SIMULATION RESULTS AND PARAMETRIC EVALUATION

The evaluation performed for the proposed eye tracking system is summarized as follows.

Database

Table 1: Database for the proposed eye tracking system
1. Video device: integrated webcam
2. Video format: MJPG, 1024x768
3. Returned colour space: RGB
4. Trigger: infinite
5. Frames per trigger: 1
6. n: number of frames (126)
7. Image resolution: 1024x768x3
8. Number of frames: 126

Parametric analysis

Table 2: Parameters of the proposed eye tracking system
1. Video device: webcam
2. n: number of frames
3. g: gamma factor
4. th: threshold value for binarization
5. min: minimum radius of the pupil
6. max: maximum radius of the pupil
7. dir: direction of pupil movement

1. Mean Square Error (MSE): the MSE is defined as the average of the squared error (deviation) between the obtained data and the reference data. It is mathematically defined as

MSE = (1/n) Σ (x_i − x̂_i)²   ............ (1)

where x_i is the reference data and x̂_i is the estimated data.

2. Peak Signal to Noise Ratio (PSNR): the PSNR is defined as the logarithm of the ratio between the peak signal power and the noise power in the image. It is mathematically defined as

PSNR = 10 log10 (MAX² / MSE)   ............ (2)

where MAX is the maximum possible pixel value of the image.

3. Sensitivity: sensitivity is defined as the ratio of true positives to the sum of true positives and false negatives, as shown in Eq. (3):

Sensitivity = TP / (TP + FN)   ............ (3)

where TP is the number of true positives and FN the number of false negatives.

4. Specificity: specificity is defined as the ratio of true negatives to the sum of true negatives and false positives, as shown in Eq. (4):

Specificity = TN / (TN + FP)   ............ (4)

where TN is the number of true negatives and FP the number of false positives.

Table 3: Experimental results
Elapsed time: 33.82 s
PSNR: 14.2 (avg.)
TP: 65/126
TN: 30/126
FP: 20/126
FN: 11/126
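As a worked check of these definitions (not part of the original paper), the short Python helper below evaluates Eqs. (1)-(4); the TP, TN, FP and FN counts are those reported in Table 3, from which a sensitivity of about 0.86 and a specificity of 0.60 follow directly.

    import numpy as np

    def mse(reference, estimate):
        # Eq. (1): mean of the squared deviation between reference and estimate
        reference = np.asarray(reference, dtype=np.float64)
        estimate = np.asarray(estimate, dtype=np.float64)
        return np.mean((reference - estimate) ** 2)

    def psnr(reference, estimate, max_value=255.0):
        # Eq. (2): 10 * log10(MAX^2 / MSE), MAX being the peak pixel value
        return 10.0 * np.log10(max_value ** 2 / mse(reference, estimate))

    def sensitivity(tp, fn):
        # Eq. (3): TP / (TP + FN)
        return tp / (tp + fn)

    def specificity(tn, fp):
        # Eq. (4): TN / (TN + FP)
        return tn / (tn + fp)

    # Counts taken from Table 3 (out of 126 frames)
    print(round(sensitivity(65, 11), 3))  # 0.855
    print(round(specificity(30, 20), 3))  # 0.6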
6. CONCLUSION

In the experiments on the proposed eye tracking system, a successful interface between the computer and the hardware was achieved. Binarization and the Hough transform were applied for the detection, identification and tracking of the pupil region of the eye. The hardware setup was built using an Arduino Uno microcontroller and a ZigBee wireless device. The experimental results show successful detection and tracking (with respect to direction) of the pupil region of the eye, with the resulting direction sent serially to the hardware interface as a control signal.

REFERENCES

[1] Sharafi, Z., Shaffer, T. and Sharif, B., 2015. Eye-Tracking Metrics in Software Engineering. In 2015 Asia-Pacific Software Engineering Conference (APSEC) (pp. 96-103). IEEE.
[2] Chandra, S., Sharma, G., Malhotra, S., Jha, D. and Mittal, A.P., 2015, December. Eye tracking based human computer interaction: Applications and their uses. In 2015 International Conference on Man and Machine Interfacing (MAMI) (pp. 1-5). IEEE.
[3] Zhang, Y., Zheng, X., Hong, W. and Mou, X., 2015, December. A comparison study of stationary and mobile eye tracking on EXITs design in a wayfinding system. In 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA) (pp. 649-653). IEEE.
[4] Miyamoto, D., Iimura, T., Blanc, G., Tazaki, H. and Kadobayashi, Y., 2014, September. EyeBit: eye-tracking approach for enforcing phishing prevention habits. In 2014 Third International Workshop on Building Analysis Datasets and Gathering Experience Returns for Security (BADGERS) (pp. 56-65). IEEE.
[5] Bozomitu, R.G., Păsărică, A., Cehan, V., Lupu, R.G., Rotariu, C. and Coca, E., 2015, November. Implementation of eye-tracking system based on circular Hough transform algorithm. In E-Health and Bioengineering Conference (EHB), 2015 (pp. 1-4). IEEE.
[6] Păsărică, A., Bozomitu, R.G., Cehan, V., Lupu, R.G. and Rotariu, C., 2015, October. Pupil detection algorithms for eye tracking applications. In Design and Technology in Electronic Packaging (SIITME), 2015 IEEE 21st International Symposium for (pp. 161-164). IEEE.
[7] Mazhar, O., Shah, T.A., Khan, M.A. and Tehami, S., 2015, October. A real-time webcam based Eye Ball Tracking System using MATLAB. In Design and Technology in Electronic Packaging (SIITME), 2015 IEEE 21st International Symposium for (pp. 139-142). IEEE.
[8] Kim, M., Morimoto, K. and Kuwahara, N., 2015, July. Using Eye Tracking to Investigate Understandability of Cardinal Direction. In Applied Computing and Information Technology/2nd International Conference on Computational Science and Intelligence (ACIT-CSI), 2015 3rd International Conference on (pp. 201-206). IEEE.
[9] Jin, C. and Li, Y., 2015, August. Research of Gaze Point Compensation Method in Eye Tracking System. In Intelligent Human-Machine Systems and Cybernetics (IHMSC), 2015 7th International Conference on (Vol. 2, pp. 12-15). IEEE.
[10] Zhang, Z., Kubo, T., Watanabe, J., Shibata, T., Ikeda, K., Bando, T., Hitomi, K. and Egawa, M., 2015, July. A classification method between novice and experienced drivers using eye tracking data and Gaussian process classifier. In Society of Instrument and Control Engineers of Japan (SICE), 2015 54th Annual Conference of the (pp. 1409-1412). IEEE.
[11] Taher, F.B., Amor, N.B. and Jallouli, M., 2015, September.
A multimodal wheelchair control system based on EEG signals and Eye tracking fusion. In Innovations in Intelligent SysTems and Applications (INISTA), 2015 International Symposium on (pp. 1-8). IEEE.
[12] Kurzhals, K., Burch, M., Pfeiffer, T. and Weiskopf, D., 2015. Eye Tracking in Computer-Based Visualization. Computing in Science & Engineering, 17(5), pp. 64-71.
[13] Zhou, F., Chen, W. and Fang, H., 2014, November. Robust eye tracking and location method based on Particle filtering algorithm. In 2014 IEEE 3rd International Conference on Cloud Computing and Intelligence Systems (pp. 247-252). IEEE.
[14] Kurzhals, K. and Weiskopf, D., 2015. Eye Tracking for Personal Visual Analytics. IEEE Computer Graphics and Applications, 35(4), pp. 64-72.