IJSRD - International Journal for Scientific Research & Development | Vol. 1, Issue 7, 2013 | ISSN (online): 2321-0613
Eye Gesture Analysis for Prevention of Road Accidents
Nitin Kumar Vashisth, Md. Zubair I.
Computer Science and Engineering, KCG College of Technology
Abstract— Around the globe, deaths occur daily, many of them caused by road accidents. Intensive research has attempted to reduce accidents and to improve Driver Assistance Systems. The core idea of this paper is a process that enhances the Intelligent Driver Assistance System and adds a safety layer that assesses where the driver is looking inside the vehicle. The system uses a CCD camera mounted in the vehicle to observe the driver's face. The driver's eye pattern is compared with a set of pre-recorded templates of the driver gazing at various focal points inside the vehicle. The windscreen is divided into segments, and matching the driver's eye-gaze pattern against the stored templates determines the driver's point of regard on the windscreen. In addition, if the driver is detected to be drowsy, with eyelids closed for more than a few seconds, an alert is raised automatically.
I. INTRODUCTION
The number of accidents rises as the driver's vigilance falls. The lower the driver's attention to the road, the greater the shift from safe driving toward hazardous outcomes. A reduced vigilance level degrades perception, judgement and vehicle control, posing a serious threat to the driver and to the people around them. Developing a module that constantly watches the driver and alerts him the moment a lapse is detected can therefore reduce the number of accidents substantially. In this regard, considerable effort has gone into developing active systems that ensure safety and reduce accident rates.
Approaches to detecting stupor (reduced alertness) in drivers can be classified as follows:
1. Sensing of physiological characteristics
2. Sensing of driver operation
3. Sensing of vehicle response
4. Monitoring the response of the driver
Road accidents are a major problem in today's fast-paced world. The Global Status Report on Road Safety published in 2013, which draws on statistics from 182 countries accounting for 99% of the world's population, indicates that the total number of road-accident deaths remains as high as 1.24 million per year, and that only 7% of the world's population lives under comprehensive and effective road-safety laws. Vehicle congestion, road infrastructure and social interaction while driving all contribute to traffic problems around the globe, and unfortunately humans are the main source of road accidents: by some estimates as many as 99% of accidents are due to human error. The first three categories above consist of both passive and active components. Behaviour or attitude change through information, education, driving instruction and enforcement falls in the domain of active safety, although gains from enforcement alone often do not last long. This paper describes how eye-gesture templates matched against the driver's eye determine the driver's point of regard on the windscreen. Research indicates that gesture-recognition techniques can determine the driver's viewpoint with accuracy sufficient to help prevent hazardous events.
II. EYE GESTURES
A. Drowsiness
Visual cues such as yawning frequency, eye blinking, gaze rate, head movements and nonverbal expressions help detect drowsiness in the driver. The observation and response pipeline is built on the integral image, the AdaBoost technique and a cascade classifier [2]. A region of interest on the face is then used to minimise the search area for finding the eyes. Anthropometric properties are applied to detect the eyes, and the exact position is refined by combining information from the grey-level pixels. A statistical model then generates samples of the eye that indicate its exact state (opened or closed). To decide whether the eyes are closed or open, a Support Vector Machine classifier is used. Drowsiness is detected from the percentage of eye closure, and the eye positions obtained also help determine the driver's face orientation, so that an alert is initiated if the driver's eyes remain closed.
Fig. 1: Classification technique
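As a rough illustration of this classification step (a sketch, not the authors' implementation), the following Python fragment trains an SVM on pre-extracted eye-region feature vectors labelled open or closed. The feature extraction and dataset are assumed to exist; synthetic data stands in for them here.

    # Minimal sketch of open/closed eye classification with an SVM.
    # Assumes eye regions have already been located and reduced to fixed-length
    # feature vectors (e.g. normalised grey-level pixels); data here is synthetic.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder training data: 200 eye-region feature vectors of length 64,
    # labelled 1 = open, 0 = closed.
    X = rng.normal(size=(200, 64))
    y = rng.integers(0, 2, size=200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = SVC(kernel="rbf", gamma="scale")   # RBF-kernel SVM as the eye-state classifier
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

    def eye_state(feature_vector):
        """Return 'open' or 'closed' for a single eye-region feature vector."""
        return "open" if clf.predict(feature_vector.reshape(1, -1))[0] == 1 else "closed"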
B. Driver Attention
Facial features and eyelid movements are the two most important cues used to assess poor driver behaviour. An anthropometric face model is used to identify prominent facial features, and a few image-processing steps make the detection robust. To identify facial features across different head-pose angles a Kalman filter is used, combined with a technique that detects the face from the skin pixels in the image. The skin-colour histogram technique, built from samples of various scenarios, selects the skin region manually and computes a normalised skin-colour histogram in RGB space.
Information such as the percentage and duration of eye closure, nodding and blinking frequency, face orientation and eye gaze is obtained in order to measure driver drowsiness using PERCLOS [2]. The device, developed by Seeing Machines, comprises three drowsiness metrics: the proportion of time the eyes were at least 70% closed, the proportion of time the eyes were at least 80% closed, and EYEMEAS (EM), the mean-square percentage of eye closure. A fuzzy classifier fuses these parameters to grade the level of driver negligence and to improve detection performance.
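To make the PERCLOS-style metrics concrete, the sketch below computes the proportion of frames in a sliding window in which the eye is at least 70% or 80% closed, plus a mean-square closure value in the spirit of EYEMEAS. The per-frame closure values are assumed to come from the eye-state analysis above; the window length is an arbitrary example.

    # Sketch of PERCLOS-style drowsiness metrics over a sliding window of frames.
    # `closure` holds per-frame eye closure (0.0 = fully open, 1.0 = fully closed).
    from collections import deque

    def perclos(closure, threshold):
        """Fraction of frames in which eye closure is at least `threshold`."""
        if not closure:
            return 0.0
        return sum(1 for c in closure if c >= threshold) / len(closure)

    window = deque(maxlen=30 * 60)          # e.g. one minute of frames at 30 fps

    def update(frame_closure):
        window.append(frame_closure)
        p70 = perclos(window, 0.70)         # proportion of time eyes >= 70% closed
        p80 = perclos(window, 0.80)         # proportion of time eyes >= 80% closed
        em = sum(c * c for c in window) / len(window)   # mean-square closure (EYEMEAS-like)
        return p70, p80, em

    # Example: simulate a drowsy spell
    for c in [0.1] * 40 + [0.9] * 20:
        metrics = update(c)
    print("PERCLOS70, PERCLOS80, EM:", metrics)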
C. Driver Distraction
Driver distraction occurs for various reasons during driving, whether picking up the phone, texting, or attending to music, PDA devices and the like. Several techniques have been employed to verify the driver's attention and to indicate distraction, based on information such as eye gaze, lip detection and head position. Researchers have deduced driver distraction using rules based on gaze, head and lane-tracking data, fusing them with a Support Vector Machine classifier. The faceLAB system from Seeing Machines is used to obtain the driver's eye features: it computes head rotation and analyses eye states (saccades, blinking, eye-gaze direction, etc.). The direction of the gaze together with the head position determines when the driver is not looking at the road, the output mapping helps indicate when the driver's attention is divided, and, most importantly, it can reveal the driver's intent to change lanes. A kernel function is used for the mapping, and the Support Vector Machine algorithm classifies the driver's behaviour, adapting well to changes over time [2]. The inference is that the eye gaze and head rotation obtained are sufficient for use in a driver-distraction system.
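A hedged sketch of the fusion step described here: per-frame gaze, head-pose and lane-tracking measurements are concatenated into one feature vector and classified as attentive or distracted with an SVM. The specific feature names and the training data are illustrative assumptions, not values from the paper.

    # Illustrative fusion of gaze, head-pose and lane-tracking features into a
    # single vector for SVM-based distraction classification (all inputs assumed).
    import numpy as np
    from sklearn.svm import SVC

    def fuse(gaze_yaw, gaze_pitch, head_yaw, head_pitch, lane_offset, blink_rate):
        """Concatenate the per-frame measurements into one feature vector."""
        return np.array([gaze_yaw, gaze_pitch, head_yaw, head_pitch, lane_offset, blink_rate])

    # Placeholder training set: 1 = distracted, 0 = attentive.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(150, 6))
    y = rng.integers(0, 2, size=150)
    clf = SVC(kernel="rbf").fit(X, y)

    sample = fuse(gaze_yaw=0.4, gaze_pitch=-0.3, head_yaw=0.5,
                  head_pitch=0.0, lane_offset=0.2, blink_rate=12.0)
    print("distracted" if clf.predict(sample.reshape(1, -1))[0] == 1 else "attentive")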
D. Eye Gesture Analysis
1) Head Rotation
While making an observation the driver rotates his or her head about a focal point. Take the x axis as the horizontal coordinate and the y axis as the vertical coordinate, with the head at the origin; turning towards the left or right is then a rotation about the y axis. While driving, routine rotations are limited to rotations about the x and y axes [2], so the following discussion covers x-axis and y-axis rotation. The process begins by capturing eye-gesture templates of the driver in varied head poses and selecting the appropriate template set from an estimate of the head rotation. During calibration the driver rotates her head to face each of the 24 windscreen cells in turn, and a complete set of eye-gesture templates is captured for each head pose: with her head pose fixed on one windscreen cell, templates are taken of her looking at all 24 windscreen cells. The 24 focal points therefore yield 24 unique sets of eye-gesture templates, each set containing 24 templates. Under regular driving conditions the driver's head rotation is estimated from the relative positions of facial feature points in the 2D video image of the face (an estimate of the 3D head rotation), the corresponding template set is selected, and the closest-matching template determines the point of regard.
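The calibration and matching procedure can be pictured as below: a table of 24 x 24 eye-gesture templates indexed by head pose and windscreen cell, with the live eye image compared against the template set for the estimated head pose. Normalised cross-correlation via OpenCV's matchTemplate is used here purely as an illustrative similarity measure; template capture and head-pose estimation are assumed to exist elsewhere, and the template sizes and random data are placeholders.

    # Sketch of eye-gesture template matching: for the estimated head pose, the
    # live eye image is compared against the 24 calibration templates (one per
    # windscreen cell) and the best-scoring cell is taken as the point of regard.
    import numpy as np
    import cv2

    N_CELLS = 24          # windscreen cells
    N_POSES = 24          # calibrated head poses

    rng = np.random.default_rng(2)
    # templates[pose][cell] -> grey-level eye image captured during calibration
    templates = [[rng.integers(0, 256, size=(32, 64), dtype=np.uint8)
                  for _ in range(N_CELLS)] for _ in range(N_POSES)]

    def point_of_regard(eye_image, pose_index):
        """Return the windscreen cell whose template best matches the eye image."""
        scores = []
        for tpl in templates[pose_index]:
            # Normalised cross-correlation between live eye image and template
            result = cv2.matchTemplate(eye_image, tpl, cv2.TM_CCOEFF_NORMED)
            scores.append(result.max())
        return int(np.argmax(scores))

    live_eye = rng.integers(0, 256, size=(32, 64), dtype=np.uint8)
    print("estimated windscreen cell:", point_of_regard(live_eye, pose_index=0))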
2) Forward and Backward Displacements
The driver's seat can be adjusted, so the driver's distance and viewpoint can affect the accuracy of the point-of-regard reading. The point of regard deviates by (Δx, Δy) whether the movement is forward or backward [2]. The change in distance Δd of the driver's head from the point-of-regard element, relative to the calibration distance d, together with the viewing angle θ, determines the values of Δx and Δy. In Fig. 2(a) the pink head represents the calibration head position, with d the distance of the pink head from the windscreen; the blue head represents a backward displacement of Δd. The horizontal position x is the distance of the point of regard from the neutral, forward-facing eye-gaze position, and Δx is the change in the point of regard resulting from the backward displacement of the driver's head. The eye-gaze viewing angle is the same for the pink and blue head positions.
Fig. 2(a): Representation of a backward displacement of the head (blue) from the calibration head position (pink)
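One way to picture the effect illustrated in Fig. 2(a) (a geometric assumption consistent with the text, not a formula quoted from the paper): if the gaze angle θ stays the same and the head moves back by Δd, the point of regard on the windscreen shifts by roughly Δx = Δd * tan(θ).

    # Hedged geometric sketch: with the eye-gaze viewing angle theta unchanged,
    # moving the head back by delta_d shifts the point of regard on the
    # windscreen by approximately delta_d * tan(theta).
    import math

    def point_of_regard_shift(delta_d_m, theta_deg):
        """Approximate horizontal shift (metres) of the point of regard."""
        return delta_d_m * math.tan(math.radians(theta_deg))

    print(point_of_regard_shift(delta_d_m=0.10, theta_deg=30))  # about 0.058 m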
3) Horizontal and Vertical Displacements
To explain horizontal and vertical displacements of the head from the calibrated position, imagine that the driver's head sits within a 2D mapping grid. Fig. 2(b) illustrates an eye-gesture mapping grid used to denote the mapping between eye-gesture templates and the point-of-regard elements inside the vehicle as the driver changes position horizontally or vertically. In the figure a grid of 64 cells is used and two drivers are shown. The pink driver is the shorter of the two; her left eye is located in cell (4, 5). Assume the pink driver is in the calibration position with her head in a neutral, forward-facing pose. The eye-gesture templates are created from the pink driver's left eye as she looks at the centre of each point-of-regard element (windscreen cell), so there are 24 corresponding eye templates calibrated for the pink driver's left eye located in cell (4, 5) in the forward-facing head pose.
Fig. 2(b): Eye gesture mapping grid
As long as the pink driver's left eye remains in the calibration cell, a match between her left eye and an eye-gesture template associated with a particular windscreen cell indicates that she is looking at that windscreen cell.
However, this is not the case for the blue driver, who is taller and positioned to the left of the pink driver; his left eye is located in cell (2, 4). A match between the blue driver's left eye and an eye-gesture template for a particular windscreen cell does not necessarily indicate that the blue driver is actually looking at that windscreen cell, because his left eye is displaced one cell to the left of and two cells above the calibration cell [1].
If the driver's left eye moves horizontally by Δx and vertically by Δy from the calibration position, then the driver's point of regard on the windscreen or mirror will also move by Δx and Δy, provided the head movement is limited to the XY plane and the gaze direction remains unchanged. The Ni-DASS system measures Δx and Δy in pixels in the video image from the on-board CCD camera observing the driver's face. To calculate real-world displacement values, the scaling between real-world distances and pixel displacements within the video must be determined. This can be achieved simply by measuring the height of the calibration driver's head in real-world units and dividing it by the height of the calibration driver's head in pixels within the video.
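The scaling just described can be written directly: real-world displacement equals pixel displacement multiplied by (real head height / head height in pixels). The sketch below is a minimal illustration; the numbers and sign convention are assumed examples.

    # Sketch of converting pixel displacements (dx_px, dy_px) measured in the
    # camera image into real-world displacements, using the calibration driver's
    # head height as the scale reference.
    def pixel_to_world(dx_px, dy_px, head_height_m, head_height_px):
        scale = head_height_m / head_height_px       # metres per pixel
        return dx_px * scale, dy_px * scale

    # e.g. a 0.24 m head imaged as 120 px gives 2 mm per pixel
    dx_m, dy_m = pixel_to_world(dx_px=35, dy_px=-10, head_height_m=0.24, head_height_px=120)
    print(dx_m, dy_m)   # 0.07 m and -0.02 m (sign convention assumed)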
III. IMPLEMENTATION
Once the eye-gesture analysis is complete, i.e. once the different states of the driver's eyes have been analysed, the driver's attention must be brought back to the road to prevent accidents. If the driver's eyes are found to be closed for more than 3-4 seconds, an automatic alarm is initiated to bring the driver's attention back to driving.
A further idea is to modify the mechanics of the car: the system can be designed so that, when the alarm is initiated, automatic brakes come into action and decelerate the moving car to a certain extent. The amount of deceleration would be designed together with the vehicle's control system.
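A hedged sketch of this alerting logic follows: if the eyes stay closed longer than a threshold (3 seconds here, matching the text), an alarm is raised. The braking hook is only a placeholder name for the vehicle-side control described above, not a real interface.

    # Sketch of the eyes-closed alarm: closure must persist beyond a threshold
    # before the alert fires; a hypothetical braking hook is left commented out.
    import time

    CLOSED_ALARM_SECONDS = 3.0

    class DrowsinessAlarm:
        def __init__(self):
            self.closed_since = None

        def update(self, eyes_closed, now=None):
            now = time.monotonic() if now is None else now
            if not eyes_closed:
                self.closed_since = None
                return False
            if self.closed_since is None:
                self.closed_since = now
            if now - self.closed_since >= CLOSED_ALARM_SECONDS:
                self.trigger_alarm()
                return True
            return False

        def trigger_alarm(self):
            print("ALERT: eyes closed too long")   # audible alert in a real system
            # request_deceleration()  # hypothetical hook into the braking system

    alarm = DrowsinessAlarm()
    for t in [0.0, 1.0, 2.0, 3.5]:                  # simulated timestamps (seconds)
        alarm.update(eyes_closed=True, now=t)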
IV. FEATURES
Features are used rather than raw pixel values for two main reasons. First, features can encode ad-hoc domain knowledge that is difficult to learn from a limited quantity of training data. Second, and critically for this system, a feature-based detector operates much faster than a pixel-based one. The simple features used are reminiscent of the Haar basis functions employed by Papageorgiou et al.
More explicitly, three kinds of features are used [1]. The value of a two-rectangle feature is the difference between the sums of the pixels within two rectangular regions; the regions have the same size and shape and are horizontally or vertically adjacent (see Fig. 3). A three-rectangle feature computes the sum within two outer rectangles subtracted from the sum in a centre rectangle [1]. Finally, a four-rectangle feature computes the difference between diagonal pairs of rectangles.
Fig. 3 shows example rectangle features relative to the enclosing detection window. The sums of the pixels within the white rectangles are subtracted from the sum of the pixels in the grey rectangles. Two-rectangle features are shown in (A) and (B), (C) shows a three-rectangle feature, and (D) a four-rectangle feature.
Fig. 3: Example rectangle features
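As an illustration of the three feature types, the sketch below evaluates two-, three- and four-rectangle features on a grey-level patch using plain array sums (Section V shows how the integral image makes the same sums cheap). The patch size and feature placement are arbitrary example values.

    # Sketch of two-, three- and four-rectangle feature values on an image patch.
    import numpy as np

    def rect_sum(img, top, left, h, w):
        return int(img[top:top + h, left:left + w].sum())

    def two_rectangle(img, top, left, h, w):
        # Two horizontally adjacent rectangles of equal size: left minus right.
        return rect_sum(img, top, left, h, w) - rect_sum(img, top, left + w, h, w)

    def three_rectangle(img, top, left, h, w):
        # Centre rectangle minus the two outer rectangles.
        outer = rect_sum(img, top, left, h, w) + rect_sum(img, top, left + 2 * w, h, w)
        return rect_sum(img, top, left + w, h, w) - outer

    def four_rectangle(img, top, left, h, w):
        # Difference between diagonal pairs of rectangles.
        a = rect_sum(img, top, left, h, w) + rect_sum(img, top + h, left + w, h, w)
        b = rect_sum(img, top, left + w, h, w) + rect_sum(img, top + h, left, h, w)
        return a - b

    patch = np.random.default_rng(3).integers(0, 256, size=(24, 24))
    print(two_rectangle(patch, 0, 0, 8, 6),
          three_rectangle(patch, 0, 0, 8, 6),
          four_rectangle(patch, 0, 0, 8, 6))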
V. INTEGRAL IMAGE
Rectangle features can be computed very rapidly using an intermediate representation of the image called the integral image. The integral image at location (x, y) contains the sum of the pixels above and to the left of (x, y), inclusive:
ii(x, y) = Σ x'≤x, y'≤y i(x', y'),
where ii(x, y) is the integral image and i(x, y) is the original image.
Fig. 4: Computing the sum within rectangle D with four array references
The sum of the pixels within rectangle D can be computed with four array references. The value of the integral image at location 1 is the sum of the pixels in rectangle A; the value at location 2 is A + B, at location 3 it is A + C, and at location 4 it is A + B + C + D. The sum within D can therefore be computed as 4 + 1 - (2 + 3).
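A minimal sketch of the integral image and the four-reference rectangle sum described above (plain NumPy, small example values assumed):

    # Sketch of the integral image: ii[y, x] holds the sum of all pixels above and
    # to the left of (x, y), inclusive, so any rectangle sum needs four look-ups.
    import numpy as np

    def integral_image(img):
        return img.cumsum(axis=0).cumsum(axis=1)

    def rect_sum(ii, top, left, h, w):
        """Sum of the pixels in the h x w rectangle with top-left corner (top, left)."""
        total = ii[top + h - 1, left + w - 1]
        if top > 0:
            total -= ii[top - 1, left + w - 1]
        if left > 0:
            total -= ii[top + h - 1, left - 1]
        if top > 0 and left > 0:
            total += ii[top - 1, left - 1]
        return int(total)

    img = np.arange(16).reshape(4, 4)
    ii = integral_image(img)
    assert rect_sum(ii, 1, 1, 2, 2) == int(img[1:3, 1:3].sum())   # the four-reference check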
VI. FACE DETECTION
The first feature selected exploits the observation that the region of the eyes is often darker than the region of the nose and cheeks. This feature is large relative to the detection sub-window and is relatively insensitive to the size or location of the face. The second feature selected relies on the property that the eyes are darker than the bridge of the nose.
Fig. 5 shows the first and second features selected by AdaBoost. The two features are shown in the top row and then overlaid on a typical training face in the bottom row [1]. The first feature measures the difference in intensity between the region of the eyes and a region across the upper cheeks, capitalising on the observation that the eye region is often darker than the cheeks. The second feature compares the intensities in the eye regions with the intensity across the bridge of the nose.
Fig. 5: The first and second features selected by AdaBoost
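The cascade-based face and eye detection that the paper builds on can be tried with OpenCV's pre-trained Haar cascades. The fragment below is only an illustration of the Viola-Jones approach, not the authors' implementation; the input frame path is a hypothetical placeholder for a frame from the in-vehicle CCD camera.

    # Illustration of Viola-Jones-style detection using OpenCV's pre-trained Haar
    # cascades for the face and eyes.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    frame = cv2.imread("driver_frame.png")          # hypothetical captured frame
    assert frame is not None, "driver_frame.png is a placeholder path"
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face_roi = gray[y:y + h, x:x + w]           # restrict the eye search to the face
        eyes = eye_cascade.detectMultiScale(face_roi)
        print("face at", (x, y, w, h), "with", len(eyes), "eye candidate(s)")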
VII. CONCLUSION
This intelligent active system can be expected to reduce the number of road accidents. Many other factors may also be taken into account, such as vehicle speed, distance keeping and lane management; nevertheless, eye-gesture analysis helps reduce major road accidents [1]. The results suggest that this approach is best suited to point-of-regard applications in which very high accuracy is not crucial. Such situations include hazard-awareness systems, where the ADAS would alert the driver if he or she fails to observe a hazard on the road, by detecting situations in which the driver is clearly not paying attention.
REFERENCES
[1] P. Viola and M. Jones, "Robust Real-Time Face Detection," International Journal of Computer Vision, July 2003.
[2] S. N. H. bt Mohd Zaid, M. Abdel Maguid and A. H. Soliman, "Eye Gesture Analysis with Head Movement for Advanced Driver Assistance Systems," 2012.
[3] I. Ravyse, M. J. T. Reinders, J. Cornelis and H. Sahli, "Eye Gesture Estimation," in Proceedings of the IEEE Benelux Signal Processing Chapter Signal Processing Symposium (SPS2000), Hilvarenbeek, The Netherlands, March 23-24, 2000.
[4] P. Viola and M. Jones, "Fast Multi-view Face Detection," Mitsubishi Electric Research Laboratories.
More Related Content

Similar to Eye Gesture Analysis Prevents Road Accidents

Real Time Driver Drowsiness Detection System for Road Safety
Real Time Driver Drowsiness Detection System for Road SafetyReal Time Driver Drowsiness Detection System for Road Safety
Real Time Driver Drowsiness Detection System for Road SafetyIRJET Journal
 
EYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEM
EYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEMEYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEM
EYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEMsipij
 
Real-Time Driver Drowsiness Detection System
Real-Time Driver Drowsiness Detection SystemReal-Time Driver Drowsiness Detection System
Real-Time Driver Drowsiness Detection SystemIRJET Journal
 
REAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTIONREAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTIONIRJET Journal
 
REAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTIONREAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTIONIRJET Journal
 
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image ProcessingDriver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image ProcessingIRJET Journal
 
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image ProcessingDriver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image ProcessingIRJET Journal
 
Realtime Car Driver Drowsiness Detection using Machine Learning Approach
Realtime Car Driver Drowsiness Detection using Machine Learning ApproachRealtime Car Driver Drowsiness Detection using Machine Learning Approach
Realtime Car Driver Drowsiness Detection using Machine Learning ApproachIRJET Journal
 
IRJET- Driver Alert System to Detect Drowsiness and Distraction
IRJET- Driver Alert System to Detect Drowsiness and DistractionIRJET- Driver Alert System to Detect Drowsiness and Distraction
IRJET- Driver Alert System to Detect Drowsiness and DistractionIRJET Journal
 
Towards a system for real-time prevention of drowsiness-related accidents
Towards a system for real-time prevention of drowsiness-related accidentsTowards a system for real-time prevention of drowsiness-related accidents
Towards a system for real-time prevention of drowsiness-related accidentsIAESIJAI
 
Driver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye ClosureDriver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye ClosureIJMER
 
Driver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye ClosureDriver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye ClosureIJMER
 
IRJET- Implementation on Visual Analysis of Eye State using Image Process...
IRJET-  	  Implementation on Visual Analysis of Eye State using Image Process...IRJET-  	  Implementation on Visual Analysis of Eye State using Image Process...
IRJET- Implementation on Visual Analysis of Eye State using Image Process...IRJET Journal
 
Driver Drowsiness and Alert System using Image Processing & IoT
Driver Drowsiness and Alert System using Image Processing & IoTDriver Drowsiness and Alert System using Image Processing & IoT
Driver Drowsiness and Alert System using Image Processing & IoTIRJET Journal
 
In advance accident alert system & Driver Drowsiness Detection
In advance accident alert system & Driver Drowsiness DetectionIn advance accident alert system & Driver Drowsiness Detection
In advance accident alert system & Driver Drowsiness DetectionIRJET Journal
 
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance SystemIRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance SystemIRJET Journal
 
Real time detecting driver’s drowsiness using computer vision
Real time detecting driver’s drowsiness using computer visionReal time detecting driver’s drowsiness using computer vision
Real time detecting driver’s drowsiness using computer visioneSAT Publishing House
 
Driver detection system_final.ppt
Driver detection system_final.pptDriver detection system_final.ppt
Driver detection system_final.pptMaseeraAhmed1
 

Similar to Eye Gesture Analysis Prevents Road Accidents (20)

Dr4301707711
Dr4301707711Dr4301707711
Dr4301707711
 
Real Time Driver Drowsiness Detection System for Road Safety
Real Time Driver Drowsiness Detection System for Road SafetyReal Time Driver Drowsiness Detection System for Road Safety
Real Time Driver Drowsiness Detection System for Road Safety
 
EYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEM
EYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEMEYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEM
EYE GAZE ESTIMATION INVISIBLE AND IR SPECTRUM FOR DRIVER MONITORING SYSTEM
 
Real-Time Driver Drowsiness Detection System
Real-Time Driver Drowsiness Detection SystemReal-Time Driver Drowsiness Detection System
Real-Time Driver Drowsiness Detection System
 
DRIVER DROWSINESS ALERT SYSTEM
DRIVER DROWSINESS ALERT SYSTEMDRIVER DROWSINESS ALERT SYSTEM
DRIVER DROWSINESS ALERT SYSTEM
 
REAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTIONREAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTION
 
REAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTIONREAL-TIME DRIVER DROWSINESS DETECTION
REAL-TIME DRIVER DROWSINESS DETECTION
 
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image ProcessingDriver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
 
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image ProcessingDriver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
Driver Dormant Monitoring System to Avert Fatal Accidents Using Image Processing
 
Realtime Car Driver Drowsiness Detection using Machine Learning Approach
Realtime Car Driver Drowsiness Detection using Machine Learning ApproachRealtime Car Driver Drowsiness Detection using Machine Learning Approach
Realtime Car Driver Drowsiness Detection using Machine Learning Approach
 
IRJET- Driver Alert System to Detect Drowsiness and Distraction
IRJET- Driver Alert System to Detect Drowsiness and DistractionIRJET- Driver Alert System to Detect Drowsiness and Distraction
IRJET- Driver Alert System to Detect Drowsiness and Distraction
 
Towards a system for real-time prevention of drowsiness-related accidents
Towards a system for real-time prevention of drowsiness-related accidentsTowards a system for real-time prevention of drowsiness-related accidents
Towards a system for real-time prevention of drowsiness-related accidents
 
Driver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye ClosureDriver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye Closure
 
Driver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye ClosureDriver Fatigue Monitoring System Using Eye Closure
Driver Fatigue Monitoring System Using Eye Closure
 
IRJET- Implementation on Visual Analysis of Eye State using Image Process...
IRJET-  	  Implementation on Visual Analysis of Eye State using Image Process...IRJET-  	  Implementation on Visual Analysis of Eye State using Image Process...
IRJET- Implementation on Visual Analysis of Eye State using Image Process...
 
Driver Drowsiness and Alert System using Image Processing & IoT
Driver Drowsiness and Alert System using Image Processing & IoTDriver Drowsiness and Alert System using Image Processing & IoT
Driver Drowsiness and Alert System using Image Processing & IoT
 
In advance accident alert system & Driver Drowsiness Detection
In advance accident alert system & Driver Drowsiness DetectionIn advance accident alert system & Driver Drowsiness Detection
In advance accident alert system & Driver Drowsiness Detection
 
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance SystemIRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
 
Real time detecting driver’s drowsiness using computer vision
Real time detecting driver’s drowsiness using computer visionReal time detecting driver’s drowsiness using computer vision
Real time detecting driver’s drowsiness using computer vision
 
Driver detection system_final.ppt
Driver detection system_final.pptDriver detection system_final.ppt
Driver detection system_final.ppt
 

More from ijsrd.com

IoT Enabled Smart Grid
IoT Enabled Smart GridIoT Enabled Smart Grid
IoT Enabled Smart Gridijsrd.com
 
A Survey Report on : Security & Challenges in Internet of Things
A Survey Report on : Security & Challenges in Internet of ThingsA Survey Report on : Security & Challenges in Internet of Things
A Survey Report on : Security & Challenges in Internet of Thingsijsrd.com
 
IoT for Everyday Life
IoT for Everyday LifeIoT for Everyday Life
IoT for Everyday Lifeijsrd.com
 
Study on Issues in Managing and Protecting Data of IOT
Study on Issues in Managing and Protecting Data of IOTStudy on Issues in Managing and Protecting Data of IOT
Study on Issues in Managing and Protecting Data of IOTijsrd.com
 
Interactive Technologies for Improving Quality of Education to Build Collabor...
Interactive Technologies for Improving Quality of Education to Build Collabor...Interactive Technologies for Improving Quality of Education to Build Collabor...
Interactive Technologies for Improving Quality of Education to Build Collabor...ijsrd.com
 
Internet of Things - Paradigm Shift of Future Internet Application for Specia...
Internet of Things - Paradigm Shift of Future Internet Application for Specia...Internet of Things - Paradigm Shift of Future Internet Application for Specia...
Internet of Things - Paradigm Shift of Future Internet Application for Specia...ijsrd.com
 
A Study of the Adverse Effects of IoT on Student's Life
A Study of the Adverse Effects of IoT on Student's LifeA Study of the Adverse Effects of IoT on Student's Life
A Study of the Adverse Effects of IoT on Student's Lifeijsrd.com
 
Pedagogy for Effective use of ICT in English Language Learning
Pedagogy for Effective use of ICT in English Language LearningPedagogy for Effective use of ICT in English Language Learning
Pedagogy for Effective use of ICT in English Language Learningijsrd.com
 
Virtual Eye - Smart Traffic Navigation System
Virtual Eye - Smart Traffic Navigation SystemVirtual Eye - Smart Traffic Navigation System
Virtual Eye - Smart Traffic Navigation Systemijsrd.com
 
Ontological Model of Educational Programs in Computer Science (Bachelor and M...
Ontological Model of Educational Programs in Computer Science (Bachelor and M...Ontological Model of Educational Programs in Computer Science (Bachelor and M...
Ontological Model of Educational Programs in Computer Science (Bachelor and M...ijsrd.com
 
Understanding IoT Management for Smart Refrigerator
Understanding IoT Management for Smart RefrigeratorUnderstanding IoT Management for Smart Refrigerator
Understanding IoT Management for Smart Refrigeratorijsrd.com
 
DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...
DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...
DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...ijsrd.com
 
A Review: Microwave Energy for materials processing
A Review: Microwave Energy for materials processingA Review: Microwave Energy for materials processing
A Review: Microwave Energy for materials processingijsrd.com
 
Web Usage Mining: A Survey on User's Navigation Pattern from Web Logs
Web Usage Mining: A Survey on User's Navigation Pattern from Web LogsWeb Usage Mining: A Survey on User's Navigation Pattern from Web Logs
Web Usage Mining: A Survey on User's Navigation Pattern from Web Logsijsrd.com
 
APPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEM
APPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEMAPPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEM
APPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEMijsrd.com
 
Making model of dual axis solar tracking with Maximum Power Point Tracking
Making model of dual axis solar tracking with Maximum Power Point TrackingMaking model of dual axis solar tracking with Maximum Power Point Tracking
Making model of dual axis solar tracking with Maximum Power Point Trackingijsrd.com
 
A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...
A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...
A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...ijsrd.com
 
Study and Review on Various Current Comparators
Study and Review on Various Current ComparatorsStudy and Review on Various Current Comparators
Study and Review on Various Current Comparatorsijsrd.com
 
Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...
Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...
Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...ijsrd.com
 
Defending Reactive Jammers in WSN using a Trigger Identification Service.
Defending Reactive Jammers in WSN using a Trigger Identification Service.Defending Reactive Jammers in WSN using a Trigger Identification Service.
Defending Reactive Jammers in WSN using a Trigger Identification Service.ijsrd.com
 

More from ijsrd.com (20)

IoT Enabled Smart Grid
IoT Enabled Smart GridIoT Enabled Smart Grid
IoT Enabled Smart Grid
 
A Survey Report on : Security & Challenges in Internet of Things
A Survey Report on : Security & Challenges in Internet of ThingsA Survey Report on : Security & Challenges in Internet of Things
A Survey Report on : Security & Challenges in Internet of Things
 
IoT for Everyday Life
IoT for Everyday LifeIoT for Everyday Life
IoT for Everyday Life
 
Study on Issues in Managing and Protecting Data of IOT
Study on Issues in Managing and Protecting Data of IOTStudy on Issues in Managing and Protecting Data of IOT
Study on Issues in Managing and Protecting Data of IOT
 
Interactive Technologies for Improving Quality of Education to Build Collabor...
Interactive Technologies for Improving Quality of Education to Build Collabor...Interactive Technologies for Improving Quality of Education to Build Collabor...
Interactive Technologies for Improving Quality of Education to Build Collabor...
 
Internet of Things - Paradigm Shift of Future Internet Application for Specia...
Internet of Things - Paradigm Shift of Future Internet Application for Specia...Internet of Things - Paradigm Shift of Future Internet Application for Specia...
Internet of Things - Paradigm Shift of Future Internet Application for Specia...
 
A Study of the Adverse Effects of IoT on Student's Life
A Study of the Adverse Effects of IoT on Student's LifeA Study of the Adverse Effects of IoT on Student's Life
A Study of the Adverse Effects of IoT on Student's Life
 
Pedagogy for Effective use of ICT in English Language Learning
Pedagogy for Effective use of ICT in English Language LearningPedagogy for Effective use of ICT in English Language Learning
Pedagogy for Effective use of ICT in English Language Learning
 
Virtual Eye - Smart Traffic Navigation System
Virtual Eye - Smart Traffic Navigation SystemVirtual Eye - Smart Traffic Navigation System
Virtual Eye - Smart Traffic Navigation System
 
Ontological Model of Educational Programs in Computer Science (Bachelor and M...
Ontological Model of Educational Programs in Computer Science (Bachelor and M...Ontological Model of Educational Programs in Computer Science (Bachelor and M...
Ontological Model of Educational Programs in Computer Science (Bachelor and M...
 
Understanding IoT Management for Smart Refrigerator
Understanding IoT Management for Smart RefrigeratorUnderstanding IoT Management for Smart Refrigerator
Understanding IoT Management for Smart Refrigerator
 
DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...
DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...
DESIGN AND ANALYSIS OF DOUBLE WISHBONE SUSPENSION SYSTEM USING FINITE ELEMENT...
 
A Review: Microwave Energy for materials processing
A Review: Microwave Energy for materials processingA Review: Microwave Energy for materials processing
A Review: Microwave Energy for materials processing
 
Web Usage Mining: A Survey on User's Navigation Pattern from Web Logs
Web Usage Mining: A Survey on User's Navigation Pattern from Web LogsWeb Usage Mining: A Survey on User's Navigation Pattern from Web Logs
Web Usage Mining: A Survey on User's Navigation Pattern from Web Logs
 
APPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEM
APPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEMAPPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEM
APPLICATION OF STATCOM to IMPROVED DYNAMIC PERFORMANCE OF POWER SYSTEM
 
Making model of dual axis solar tracking with Maximum Power Point Tracking
Making model of dual axis solar tracking with Maximum Power Point TrackingMaking model of dual axis solar tracking with Maximum Power Point Tracking
Making model of dual axis solar tracking with Maximum Power Point Tracking
 
A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...
A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...
A REVIEW PAPER ON PERFORMANCE AND EMISSION TEST OF 4 STROKE DIESEL ENGINE USI...
 
Study and Review on Various Current Comparators
Study and Review on Various Current ComparatorsStudy and Review on Various Current Comparators
Study and Review on Various Current Comparators
 
Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...
Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...
Reducing Silicon Real Estate and Switching Activity Using Low Power Test Patt...
 
Defending Reactive Jammers in WSN using a Trigger Identification Service.
Defending Reactive Jammers in WSN using a Trigger Identification Service.Defending Reactive Jammers in WSN using a Trigger Identification Service.
Defending Reactive Jammers in WSN using a Trigger Identification Service.
 

Recently uploaded

OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSRajkumarAkumalla
 
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).pptssuser5c9d4b1
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordAsst.prof M.Gokilavani
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escortsranjana rawat
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
AKTU Computer Networks notes --- Unit 3.pdf
AKTU Computer Networks notes ---  Unit 3.pdfAKTU Computer Networks notes ---  Unit 3.pdf
AKTU Computer Networks notes --- Unit 3.pdfankushspencer015
 
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...roncy bisnoi
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Christo Ananth
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlysanyuktamishra911
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...ranjana rawat
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxAsutosh Ranjan
 

Recently uploaded (20)

OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
 
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
 
Water Industry Process Automation & Control Monthly - April 2024
Water Industry Process Automation & Control Monthly - April 2024Water Industry Process Automation & Control Monthly - April 2024
Water Industry Process Automation & Control Monthly - April 2024
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
AKTU Computer Networks notes --- Unit 3.pdf
AKTU Computer Networks notes ---  Unit 3.pdfAKTU Computer Networks notes ---  Unit 3.pdf
AKTU Computer Networks notes --- Unit 3.pdf
 
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 
Roadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and RoutesRoadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and Routes
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptx
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghly
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptx
 

Eye Gesture Analysis Prevents Road Accidents

  • 1. IJSRD - International Journal for Scientific Research & Development| Vol. 1, Issue 7, 2013 | ISSN (online): 2321-0613 All rights reserved by www.ijsrd.com 1452 Abstract— Around the globe, death is a daily occurrence mainly due to accidents. Research has been conducted intensively to attempt reduce accidents and extemporize the Driver Assistance System. The core idea for this paper is depicted through a process evolved to enhance effectively the Intelligent Driver assistance system and also a safety system to access the driver’s perspectives with vehicles. The system uses a dynamic CCD camera in the vehicle that observes the driver’s face. A prototype to match the approach is used to compare the Driver’s eye pattern with a set of existing templates of the driver gazing at various focal points inside the vehicle. The windscreen is further divided into segments and a comparison of the driver’s eye gaze pattern with the existing stencil determines the driver’s view point on the windscreen. For instance, in case the driver is detected to be drowsy with closed eyelids for more than a few seconds then he will be alerted automatically. I. Introduction The increasing number of accidents is proportional to the driver’s take on the attributed monitoring. This corresponds to the Vigilance level. The lower the monitoring the greater the shift from healthy is driving to hazardous outcomes. This vigilance level affects the perceptions, attitudes and control abilities thereby posing a great threat to themselves and the people around them. Hence, developing a module to constantly watch the driver and incessantly alert him in case of any breach will effectively reduce the number of accidents. In correspondence to this regard, voluminous efforts have been made to developing an active system that ensures safety and effectively reduce the accident levels. The classification for stupor in drivers is as follows:- 1 Sensing of physiological characteristics. 2 sensing of driver operation 3 Sensing of vehicle response. 4 Monitoring the response of driver Accidents or Road hassles are a major problem in this fast pacing world of today. The Global Status Report on Road safety published in 2013, a statistics from 182 countries , accounting for 99% of the world population indicates that total number of deaths in accidents remain as high as 1.24 million per year. Only 7 % of the total population has comprehensive and effective road safety rules enforced. Vehicle congestion, road infrastructure and social interaction while driving leads to traffic around the globe. And unfortunately humans tend to be the main source and resource for road accidents. 99% of the accidents occur due to err of the common man. The first three categories consist of both passive and active components. Behavior alteration or attitude amendment is due to information, education, driving instruction and enforcement in the dominion of active safety. The marginal cause can also pertain to government’s enforcement that do not last long. This paper elucidates on the fact that eye-gesture templates matched to the driver’s eye to determine the driver’s viewpoint on the windscreen. Research indicates that gesture recognition techniques can in fact determine the perspective as of the driver with accuracy that suffices to prevent hazardous events. II. EYE GESTURES A. Drowsiness Visual cues like yawn frequency, blinking of the eye, gaze rate, head movements and nonverbal expressions aids detect drowsiness in the driver. 
Integral aspects are considered to make an efficient observation and stimulated response based on the integral image, Ada Boost technique and cascade classifier.[2] Followed by using the region of interest on the face thereby minimizing the search area for finding the eyes. Anthropometric property techniques are implied to detect the eyes and the exact position is detected by collaborating information from the grey level pixels. This is followed by a mathematical equation to generate random samples of the eyes indicating the exact state of the eye ( whether opened or closed). In order to identify whether they eyes being closed or opened the classification technique (support Vector Machine) is used. The drowsiness is detected bu the percentage of eye closed and this eye position generated helps determine the driver’s face orientation. Thereby, initiating an alert if the driver’s eyes are closed Fig. 1: classification technique B. Driver Attention Facial features and eyelid movements are the two most important factors used to asses poor driver behavior. A face model that is anthropometrical is used to identify prominent facial features. The image is made robust using few image processing techniques. Effectively to identify facial features from different angles of head pose the Kalman filter is used, which is detailed to have a technique that detects the face by the skin pixels on the image. The color skin histogram technique that used to obtain samples from various scenarios chooses the skin region manually and computes a normalized skin color histogram to validate the RGB-space. Eye Gesture Analysis for Prevention of Road Accidents Nitin Kumar Vashisth1 Md. Zubair. I2 1,2 Computer Science and Engineering 1,2 KCG College of Technology
  • 2. Eye Gesture Analysis for Prevention of Road Accidents (IJSRD/Vol. 1/Issue 7/2013/0018) All rights reserved by www.ijsrd.com 1453 Information such as percentage and duration of the eye closure, nodding and blinking frequency, face orientation and eye gaze is obtained in order to measure the driver drowsiness from PERCLOS.[2] The device is developed by Seeing Machine which is comprised three drowsiness metrics. They are proportion of time the eyes were closed at least 70%; the time the eyes were closed at least 80% and EYEMEAS (EM) where mean square percentage of eye closure rating measured. Using a fuzzy classifier the parameters are fused together to segregate the level of driver’s negligence plus to increase the performance of detection. C. Driver Distraction Driver distraction occurs due to various reasons during the driving period. It can be either picking up the phone, texting or other distractions like music, PDA devices etc. A few techniques have been employed to assure total attention of the driver and indicate distraction. This information is composed based on eye gaze, lips detection, head position etc. Driver’s distraction was deduced by researchers using rules based on gaze, head lane tracking data and by fusing them together with support vector machine classification technique. The Face Lab of Seeing Machines is applied to obtain the eye features of the driver. Face Lab is a device to calculate head rotation and analyses eye states (e.g. saccades, eye blinking, eye gaze direction etc.). The direction of the gaze and the positioning of the head determine the driver’s attention when he is not looking at the road. The output mapping also assists in indicating when the driver is engaged in division of attention. And most importantly it is able to generate the driver’s intent to changes lanes. The Kernel function used for mapping and the Support Vector Machine Algorithm to classify the driver’s behavior as a consequence of better adaption to timely changes.[2] The inference depicts that the eye gaze and rotation of the head obtained is sufficient to be employed in driver distraction system. D. Eye Gesture Analysis 1) Head Rotation While making an observation the driver rotates his or her head around the focal point. Consider x axis to be the horizontal coordinate with respect to y the vertical coordinate and if the head is at origin, the turning towards left or right side oscillates about the y axis. While driving the routine rotations are limited to the x axis and y axis.[2] Therefore the flowing extract will elucidate on x axis and y axis rotations. To begin this process, it is essential to capture eye-gesture templates of the driver in varied head poses and to select the appropriate template set upon an estimate of the rotation of the head followed by a calibration driver rotating over her head to face each of the 24 windscreen cells and a complete set of eye motion templates are taken for each head pose. For instance, the calibration driver rotates over her head facing a windscreen cell and eye-gesture templates captured of the driver looking at all 24 windscreen cells while keeping her head pose fixed. The 24 focal-points will consists of be 24 unique sets of eye-gesture templates with each set containing 24 templates. 
Under regular driving conditions, the driver’s head rotation is estimated by the methodology presented based upon estimating the 3D rotation of the head from relative positions of facial feature points in the 2D video image of the face capturing the templates and closing on the closest match. 2) Forward and Backward Displacements The driver’s seat can be adjusted and hence the distance and the point of view for the driver can affect the accuracy of the reading of the point of regard. The point of regard deviates by (Δx, Δy) irrespective of the direction of movement i.e. Forward or backward.[2] The change in distance Δd of the driver’s head from the point-of-regard element with respect to the calibration distance d and the viewing angle θ affects the value of the Δx and Δy.The pink head representing the calibration of the head position with d being the distance of the pink head from the windscreen. The blue head represents a backward displacement of Δd. The horizontal position x is representing the distance of the point of regard from the neutral, forward facing eye-gaze position and Δx represents the change in the point of regard resulting from the backward displacement of the driver’s head. The eye-gaze viewing angle remains the same in both the pink head position and the blue head position. Fig. 2 3) Horizontal and vertical displacements To explain the horizontal and vertical displacements of the head from the standardized position, imagine that the driver’s head is within a 2d mapping grid, figure 2 illustrates an eye gesture mapping grid used to denote the mapping between eye gesture templates and the point of regard elements inside the vehicle as the driver changes his position horizontally or vertically. In the figure a grid of 64 cells is used and two drivers are shown. The pink driver is the shorter of the two drivers. Her right is located in cell (4, 5). If the pink driver is in the calibration position and her hear is in a neutral position, forward posing side. The pink driver’s left eye as she looks at the center of the point of regard elements(windscreen cells) creates the Eye gesture templates, there would be 24 corresponding eye templates calibrated for the pink driver’s left eye located in cell (4,5) in the forward facing head pose. Fig. 2: Eye gesture mapping grid Representation of a backward displacement of the head (blue) from the calibration head position (pink)
As long as the pink driver remains in this calibration position, a match between her left eye and the eye-gesture template for a particular windscreen cell indicates that she is looking at that windscreen cell. However, this is not the case for the blue driver, who is taller and positioned to the left of the pink driver; his left eye is located within cell (2, 4). A match between the blue driver's left eye and an eye-gesture template for a particular windscreen cell may not necessarily indicate that the blue driver is actually looking at that windscreen cell, because his left eye is displaced one cell to the left and two cells above the calibration cell.[1]
If the driver's left eye moves horizontally by Δx and vertically by Δy from the calibration position, then the driver's point of regard on the windscreen or mirror will also move by Δx and Δy, provided the head movement is limited to the XY plane and the gaze direction remains unchanged. The Ni-DASS system measures Δx and Δy in pixels in the video image from the on-board CCD camera observing the driver's face. To convert these into real-world displacement values, the scaling between real-world distances and pixel displacements within the video must be determined. This can be achieved simply by measuring the height of the calibration driver's head in real-world units and dividing it by the height of the calibration driver's head in pixels within the video.
III. IMPLEMENTATION
Once the eye-gesture analysis is done, i.e. once the different states of the driver's eyes have been analysed, the driver's attention has to be brought back to driving in order to prevent accidents. If the driver's eyes are found to be closed for more than 3-4 seconds, an automatic alarm is initiated to bring the driver's attention back to driving. Another idea is to modify the mechanics of the car: a system can be designed so that, if the alarm is initiated, automatic brakes come into action and decelerate the moving car to an extent. The amount of deceleration will be designed along with the control system.
IV. FEATURES
Features are used rather than raw pixels for two main reasons. First, features can encode ad-hoc domain knowledge that would be difficult to learn from a limited quantity of training data. Second, and critical in this case, a feature-based system operates much faster than a pixel-based one. The simple features used are reminiscent of the Haar basis functions that have been used by Papageorgiou et al. More explicitly, three kinds of features are used.[1] The value of a two-rectangle feature is the difference between the sums of the pixels within two rectangular regions. The regions have the same size and shape and are horizontally or vertically adjacent (see Fig. 3). A three-rectangle feature computes the sum within two outside rectangles subtracted from the sum in a centre rectangle.[1] Finally, a four-rectangle feature computes the difference between diagonal pairs of rectangles. Fig. 3 shows example rectangle features relative to the enclosing detection window: the sums of the pixels which lie within the white rectangles are subtracted from the sum of the pixels in the grey rectangles. Two-rectangle features are shown in (A) and (B), (C) shows a three-rectangle feature, and (D) a four-rectangle feature.
Fig. 3: Example rectangle features
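A direct, unoptimised Python sketch of the three feature types follows; the function names and window coordinates are illustrative only, and the sums are taken straight from the pixel values (the integral-image speed-up is described in the next section).

import numpy as np

def region_sum(img, top, left, h, w):
    # Sum of the pixels in the rectangle with the given top-left corner and size.
    return float(img[top:top + h, left:left + w].sum())

def two_rect_feature(img, top, left, h, w):
    # Difference between two equal, horizontally adjacent rectangles.
    return region_sum(img, top, left, h, w) - region_sum(img, top, left + w, h, w)

def three_rect_feature(img, top, left, h, w):
    # Sum within the two outside rectangles subtracted from the centre rectangle.
    outside = region_sum(img, top, left, h, w) + region_sum(img, top, left + 2 * w, h, w)
    return region_sum(img, top, left + w, h, w) - outside

def four_rect_feature(img, top, left, h, w):
    # Difference between diagonal pairs of rectangles.
    diag1 = region_sum(img, top, left, h, w) + region_sum(img, top + h, left + w, h, w)
    diag2 = region_sum(img, top, left + w, h, w) + region_sum(img, top + h, left, h, w)
    return diag1 - diag2

# Usage on a random grayscale patch standing in for a 24x24 detection sub-window.
window = np.random.rand(24, 24)
print(two_rect_feature(window, 4, 4, 8, 6),
      three_rect_feature(window, 4, 4, 8, 6),
      four_rect_feature(window, 4, 4, 8, 6))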
V. INTEGRAL IMAGE
Rectangle features can be computed very quickly using an intermediate representation of the image called the integral image. The integral image at location (x, y) contains the sum of the pixels above and to the left of (x, y), inclusive:
ii(x, y) = Σ_{x' ≤ x, y' ≤ y} i(x', y')
where ii is the integral image and i is the original image.
As Fig. 4 illustrates, the sum of the pixels within rectangle D can be computed with four array references. The value of the integral image at location 1 is the sum of the pixels in rectangle A. The value at location 2 is A + B, at location 3 is A + C, and at location 4 is A + B + C + D. The sum within D can therefore be computed as 4 + 1 - (2 + 3).
Fig. 4: The sum of the pixels within rectangle D computed with four array references
VI. FACE DETECTION
The first feature selected focuses on the property that the region of the eyes is often darker than the region of the nose and cheeks. This feature is relatively large in comparison with the detection sub-window and is somewhat insensitive to the size or location of the face. The second feature selected relies on the property that the eyes are darker than the bridge of the nose. Fig. 5 shows the first and second features selected by AdaBoost. The two features are shown in the top row and then overlaid on a typical training face in the bottom row.[1] The first feature measures the difference in intensity between the region of the eyes and a region across the upper cheeks; it capitalises on the observation that the eye region is often darker than the cheeks. The second feature compares the intensities in the eye regions to the intensity across the bridge of the nose.
Fig. 5: The first and second features selected by AdaBoost
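As a minimal sketch (using NumPy as an assumed implementation choice, with illustrative function names), the integral image and the four-reference rectangle sum described in Section V can be written as:

import numpy as np

def integral_image(img):
    # Padded integral image: ii[y, x] holds the sum of img[:y, :x], so each
    # entry is the sum of all pixels above and to the left of (x, y).
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=float)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, top, left, h, w):
    # Sum of the original pixels inside a rectangle using only four array
    # references into the integral image: the "4 + 1 - (2 + 3)" rule above.
    return float(ii[top + h, left + w] + ii[top, left]
                 - ii[top, left + w] - ii[top + h, left])

# Check against a direct sum on a small example image.
img = np.arange(36, dtype=float).reshape(6, 6)
ii = integral_image(img)
assert rect_sum(ii, 2, 1, 3, 4) == img[2:5, 1:5].sum()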
VII. CONCLUSION
This intelligent active system can effectively help decrease the number of road accidents. Many other factors may also be taken into account, such as the speed of the vehicle, distance keeping and lane management; however, eye-gesture analysis already helps reduce major road accidents.[1] The results suggest that this approach may be best suited to point-of-regard applications in which high accuracy is not crucial. Such applications include hazard awareness systems, in which the ADAS alerts the driver if he or she is failing to observe a hazard on the road by detecting situations in which the driver is clearly not paying attention.
REFERENCES
[1] P. Viola and M. Jones, "Robust Real-Time Face Detection", International Journal of Computer Vision, July 11, 2003.
[2] Siti Nor Hafizah bt Mohd Zaid, Mohamed Abdel Maguid and Abdel Hamid Soliman, "Eye Gesture Analysis with Head Movement for Advanced Driver Assistance Systems", 2012.
[3] I. Ravyse, M. J. T. Reinders, J. Cornelis and H. Sahli, "Eye Gesture Estimation", in Conference Proceedings CD-ROM of the IEEE Benelux Signal Processing Chapter, Signal Processing Symposium (SPS2000), Hilvarenbeek, The Netherlands, March 23-24, 2000.
[4] P. Viola and M. Jones, "Fast Multi-view Face Detection", Mitsubishi Electric Research Laboratories.