Imagine yourself as an intelligent, motivated, hard-working person in the fiercely competitive market of information technology, with just one problem: you can't use your hands, or you can't speak. How do you do your job? How do you stay employed? You can, thanks to a remarkable gift from the computer industry: the Eyegaze, a communication and control system you run with your eyes. In humans, gaze direction and ocular behaviour are probably among the earliest means of communication at a distance. Parents often try to understand what their baby is looking at, and they deduce that the observed object attracts the child's interest. This ability to interact with someone through a shared object of attention is called joint attention.
The Eyegaze System is a direct-select, vision-controlled communication and control system. It was developed in Fairfax, Virginia, by LC Technologies.
EYE GAZE COMMUNICATION SYSTEM
The Eyegaze System is a communication system for people with complex physical disabilities. It is operated with the eyes, by looking at control keys displayed on a screen.
Contents: Abstract / Introduction / Users of Eyegaze / Skills required by the users / Hardware parts / Working / Menus of the Eye Gaze Communication System / Applications / Limitations / New portable Eye Gaze Communication / Environment required for Eye Gaze Communication / The Eye Gaze Communication System components and prices / Conclusion / Thank you
JIEMS, Akkalkuwa Department of Computer Engineering
Eye Gaze Communication
APPROVED BY AICTE, NEW DELHI & AFFILIATED TO NORTH MAHARASHTRA
UNIVERSITY, JALGAON; CERTIFIED BY ISO 9001:2008
Jamia Institute of Engineering & Management Studies, Akkalkuwa
Department of Computer Engineering
Seminar report on
“Eye Gaze”
by
Mr. Saba Karim
[BE COMPUTER]
Jamia Institute of Engineering & Management
Studies, Akkalkuwa
CERTIFICATE
This is to certify that the seminar report entitled “Eye Gaze”, being
submitted by Mr. Saba Karim to the Computer Engineering Department, is a
record of bonafide work carried out by him under my supervision and guidance
during the year 2018-2019.
Seminar Guide Head of Department
[Mohammad Asif] [Prof. Patel Suhel Ishaq]
I/c Principal
[Prof. Saiyed Irfan]
CHAPTER 1
INTRODUCTION
1.1 Introduction
Imagine yourself as an intelligent, motivated, hard-working person in the fiercely competitive market of information technology, with just one problem: you can't use your hands, or you can't speak. How do you do your job? How do you stay employed? You can, thanks to a remarkable gift from the computer industry: the Eyegaze, a communication and control system you run with your eyes. In humans, gaze direction and ocular behaviour are probably among the earliest means of communication at a distance. Parents often try to understand what their baby is looking at, and they deduce that the observed object attracts the child's interest. This ability to interact with someone through a shared object of attention is called joint attention.
The Eyegaze System is a direct-select, vision-controlled communication and control system. It was developed in Fairfax, Virginia, by LC Technologies.
Fig. 1.1: Eye gaze communication
This system is mainly developed for those who lack the use of their hands or voice. The only requirements to operate the Eyegaze are control of at least one eye with good vision and the ability to keep the head fairly still. Eyegaze Systems are in use around the world. Its users are adults and children with cerebral palsy, spinal cord injuries, brain injuries, ALS, multiple sclerosis, brainstem strokes, muscular dystrophy, and Werdnig-Hoffmann disease. Eyegaze Systems are being used in homes, offices, schools, hospitals, and long-term care facilities. By looking at control keys displayed on a screen, a person can synthesize speech, control his environment (lights, appliances, etc.), type, operate a telephone, run computer software, operate a computer mouse, and access the Internet and e-mail. Eyegaze Systems are being used to write books, attend school, and enhance the quality of life of people with disabilities all over the world.
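In software terms, the direct-select scheme just described reduces to a hit test of the predicted gaze point against the layout of on-screen control keys. The following is a minimal sketch, assuming a simple rectangular key grid; the key names and pixel coordinates are invented for illustration and are not taken from the actual Eyegaze System.

```python
# Hypothetical direct-select hit test: which control key does the gaze
# point fall inside? Key names and layout are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Key:
    name: str
    x: int   # left edge, screen pixels
    y: int   # top edge
    w: int   # width
    h: int   # height

    def contains(self, gx: int, gy: int) -> bool:
        # True if the gaze point (gx, gy) lies inside this key's rectangle
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def key_at(keys, gx, gy):
    """Return the key under the gaze point (gx, gy), or None if no key is hit."""
    for key in keys:
        if key.contains(gx, gy):
            return key
    return None

# Two example keys side by side on the top row of the screen
keys = [Key("SPEAK", 0, 0, 200, 100), Key("LIGHTS", 200, 0, 200, 100)]
hit = key_at(keys, 250, 40)   # gaze point inside the LIGHTS key
```

Once a key is identified, the system would dispatch the corresponding action (speech synthesis, appliance control, typing, and so on).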
1.2 The skills needed by the user:
1.2.1 Good control of one eye:
The user must be able to look up, down, left, and right. He must be able to fix his gaze on all areas of a 15-inch screen that is about 24 inches in front of his face, and he must be able to focus on one spot for at least half a second. Several common eye-movement problems may interfere with Eyegaze use. These include:
Nystagmus (constant, involuntary movement of the eyeball): The user may not be able to fix his gaze long enough to make eye-gaze selections.
Alternating strabismus (the eyes cannot be directed at the same object; one or the other deviates): The Eyegaze System constantly tracks the same single eye. If, for example, a user with alternating strabismus is operating the Eyegaze System with the right eye, and that eye begins to deviate, the left eye will take over and focus on the screen. The Eyegaze camera, however, will continue to take pictures of the right eye, and the System will not be able to determine where the user's left eye is focused. When the left eye deviates and the right eye is again fixed on the screen, the Eyegaze System will resume predicting the gaze point. Putting a partial eye patch over the nasal side of the eye not being observed by the camera often solves this tracking problem. Since only the unpatched eye can see the screen, it will continuously focus on the screen. Because only a nasal-side patch is applied to the other eye, the user retains peripheral vision on that side.
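The fixation requirement above (holding the gaze on one spot for at least half a second) is commonly implemented as a dwell-time selector: a selection fires only when consecutive gaze samples stay within a small tolerance region long enough. A minimal sketch follows, assuming gaze samples arrive as (time, x, y) tuples; the 40-pixel tolerance radius is an assumed value, not one documented in this report. It also shows why nystagmus defeats selection: constantly moving samples keep restarting the fixation timer.

```python
# Dwell-time gaze selection sketch. Assumptions: samples are (t_seconds, x, y);
# the tolerance radius is illustrative, not the Eyegaze System's actual value.
import math

DWELL_SECONDS = 0.5   # minimum fixation time, per the text
RADIUS_PX = 40.0      # assumed tolerance around the fixation start point

def detect_dwell(samples):
    """Return (x, y) of the first fixation held >= DWELL_SECONDS, else None."""
    anchor = None                    # (t, x, y) where the current fixation began
    for t, x, y in samples:
        if anchor is None or math.hypot(x - anchor[1], y - anchor[2]) > RADIUS_PX:
            anchor = (t, x, y)       # gaze moved away: restart the fixation timer
        elif t - anchor[0] >= DWELL_SECONDS:
            return (anchor[1], anchor[2])
    return None

# Steady gaze at ~(100, 200), sampled at 60 Hz: a selection is made.
steady = [(i / 60, 100 + (i % 2), 200) for i in range(40)]
# Nystagmus-like jitter, jumping 100 px per sample: no selection ever fires.
jitter = [(i / 60, 100 * i, 0) for i in range(40)]
```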
1.2.2 Adequate vision:
Several common vision problems may affect a user's ability to see text clearly on the Eyegaze
monitor. These include the following:
1.2.3 Inadequate visual acuity:
The user must be able to see text on the screen clearly. If, prior to his injury or the onset of his illness, he wore glasses, he may need corrective lenses to operate the Eyegaze System. If he is over 40 years old and has not had his vision checked recently, he might need reading glasses in order to see the screen clearly. In most cases, eyetracking works well with glasses; the calibration procedure accommodates the refractive properties of most lenses. Hard-line bifocals can be a problem if the lens boundary splits the image of the pupil, making it difficult for the system's image-processing software to determine the pupil center accurately. Graded bifocals, however, typically do not interfere with eyetracking. Soft contact lenses that cover all or most of the cornea generally work well with the Eyegaze System; the corneal reflection is obtained from the contact-lens surface rather than the cornea itself. Small, hard contacts can interfere if the lens moves around considerably on the cornea and causes the corneal reflection to move across the discontinuity between the contact lens and the cornea.
Diplopia (double vision):
Diplopia may be the result of an injury to the brain, or a side effect of many commonly prescribed
medications, and may make it difficult for the user to fix his gaze on a given point. Partially patching the
eye not being tracked may alleviate double vision during Eyegaze System operation.
1.2.4 Blurred vision:
Another occurrence associated with some brain injuries, as well as a side effect of medications, a
blurred image on the screen decreases the accuracy of eye fixations.
1.2.5 Cataracts (clouding of the lens of the eye):
If a cataract has formed on the portion of the lens that covers the pupil, it may prevent light from
passing through the pupil to reflect off the retina. Without a good retinal reflection the Eyegaze System
cannot accurately predict the user's eye fixations. The clouded lens may also make it difficult for a user to
see text on the screen clearly. Surgical removal of the cataracts will normally solve the problem and make
Eyegaze use possible.
1.2.6 Homonymous hemianopsia (blindness or defective vision in the right or left halves of the visual fields of both eyes):
This may make calibration almost impossible if the user cannot see calibration points on one side of the screen.
1.3 Ability to maintain a position in front of the Eyegaze monitor:
It is generally easiest to run the System from an upright, seated position, with the head centered in
front of the Eyegaze monitor. However, the Eyegaze System can be operated from a semi-reclined position
if necessary. Continuous, uncontrolled head movement can make Eyegaze operation difficult, since the
Eyegaze System must relocate the eye each time the user moves away from the camera's field of view and
then returns. Even though the System's eye search is completed in just a second or two, it will be more
tiring for a user with constant head movement to operate the System.
Absence of medication side effects that affect Eyegaze operation:
Many commonly prescribed medications have potential side effects that can make it difficult to operate
Eyegaze. Anticonvulsants (seizure drugs) can cause: nystagmus, blurred vision, diplopia, dizziness,
drowsiness, headache and confusion. Some antidepressants can cause blurred vision and mydriasis
(abnormally dilated pupils). Baclofen, a drug commonly used to decrease muscle spasms, can cause
dizziness, drowsiness, headache, disorientation, blurred vision and mydriasis. Mydriasis can be severe
enough to block eyetracking. If the retinal reflection is extremely bright, and the corneal reflection is
sitting on top of a big, bright pupil, the corneal reflection may be indistinguishable and therefore
unreadable by the computer.
1.4 Mental abilities that improve the probability for successful Eyegaze use:
Cognition: Cognitive level may be difficult to assess in someone who is locked in, especially if a
rudimentary communication system has not been established. In general, a user with average intelligence
will best maximize the capabilities of an Eyegaze System.
Ability to read: At present, the Eyegaze System is configured for users who are literate. The System is text-based. A young child with average
intelligence may not be reading yet, but probably has the capability to learn to read at an average age. He
may be able to recognize words, and may be moving his eyes in a left to right pattern in preparation for
reading. As an interim solution many teachers and parents stick pictures directly onto the screen. When
the child looks at the picture he activates the Eyegaze key that is located directly underneath it.
1.4.1 Memory:
Memory deficits are a particular concern in considering the Eyegaze System for someone with a
brain injury. A user who can't remember from one day to the next how to operate the system may find it
too difficult to use effectively.
CHAPTER 2
LITERATURE SURVEY
2.1 HISTORY
Eye and gaze tracking: state of the art. This section introduces the latest uses of eye gaze tracking
in applications, with a special focus on interactive applications and/or video games; but in order to
understand the present, let us first have a look at the past. After the Second World War, one of the first
measurements of gaze direction was made in 1947 by the group of Fitts, Jones and Milton. They published technical
reports in the late 1940s that are considered to be the seminal research on visual sampling and represent
the largest collection of eye movement data collected in a visual monitoring task. The data encompass
over 500,000 frames of movie film of over 40 pilots taken under various flight conditions. The general
conclusion was that: It is reasonable to assume that the frequency of eye fixations is an indication of the
relative importance of that instrument. The length of fixations, on the contrary, may more properly be
considered as an indication of the relative difficulty of checking and interpreting particular instruments.
[...] If we know where a pilot is looking, we do not necessarily know what he is thinking, but we know
something of what he is thinking about.
Following this work, authors were able to propose a more efficient arrangement of instruments and
identified those which were difficult to read, for a possible redesign of the actual instrument. This was the
first time a survey allowed interaction between an application (an airplane cockpit) and a manual gaze
tracking system. It was also the first time video was used to perform measures. Actually, they were
mainly performed using a medical technique that allowed registration of eyeball movements using a
number of electrodes positioned around the eye.
Fig: Left: the gaze tends to come and go between the eyes and mouth in the picture of a face. Right: depending on where you look, you see either a musician or a woman's face.
Most of the described techniques required the viewer's head to be motionless during eye tracking and
used a variety of invasive devices. The major innovation in eye tracking was the invention of the
head-mounted eye tracker ([13], [23], [24], [30]); this technique is still widely used. Another reference work in the gaze
tracking world is the one done by Yarbus. Yarbus was a Russian psychologist who studied eye
movements and saccadic exploration of complex images in the 1950s and 1960s. He recorded the eye
movements performed by observers while viewing natural objects and scenes. Here again, this work tends
to show that the gaze direction is crucial in interactivity; indeed, Yarbus showed that the gaze trajectories
followed depend on the task that the observer has to perform (cf. the classical figure and experiment 2.2).
Eyes would concentrate on areas of the images relevant to the questions. Much of the relevant work
in the 1970s focused on technical improvements to increase accuracy and precision and to reduce the impact
of the trackers. Jacob proposes a brief history of eye
tracking systems during these years. Since this period, work in this area has been deeply correlated with the performance of
computers: as computing power progresses, it provides the resources necessary for real-time and/or
complex applications. For instance, it is nowadays possible to develop a human-computer interface using
gaze tracking. During the 1980s, interest in gaze tracking persisted; the boom of personal
computers made it possible to design new interfaces and new ways of thinking about our relationship with the computer [5], [4]
and [31]. Another pioneer was Dr. Levine who was one of the first to see the potential of eye tracking in
interactive applications. Over the past 15 years, eye and gaze tracking have become an industrial stake, and many
manufacturers have developed products in this field of research. The dramatically increasing number of
publications around this topic prevents an exhaustive state of the art; indeed, many journals,
conferences and publications have been created around this topic (conferences: ECEM - the European
Conference on Eye Movements, SWAET - the Scandinavian Workshop on Applied Eye-tracking, ETRA -
Eyetracking Research and Applications...). At least seven sponsors are represented in the last ETRA
conference. Nowadays, many systems, invasive or not, make it possible to measure, follow, analyse and log in real
time numerous data coming from cameras. Applications are varied, from military uses to advertisement
analysis via medical applications. A complete vision of theories and applications is presented in. With the
advances in eye gaze sensing technologies these systems are now much more precise and far less
intrusive. As a consequence, research using eye gaze as an input stream has grown steadily. This
reality leads to the existence of several ways of tracking the direction of eye-gaze. [3], [27] and [10]
provide the following list of requirements for an ideal tracking device, which are still not fully satisfied by
current techniques.
1. Offer an unobstructed field of view with good access to the face and head
2. Make no contact with the subject
3. Meet the practical challenge of being capable of artificially stabilising the retinal image if necessary
4. Possess an accuracy of at least one percent or a few minutes of arc. Accuracy is limited by the
cumulative effects of non-linearity, distortion, noise, lag and other sources of error
5. Offer a resolution of 1 minute of arc, and thus be capable of detecting the smallest changes in
eye position; resolution is limited only by instrumental noise
6. Offer a wide dynamic range of one minute of arc to 45° for eye position, and one minute of arc per second to 800° per second for eye velocity
7. Offer good temporal dynamics and speed of response (e.g. good gain and small phase shift to 100 Hz,
or a good step response).
8. Possess a real-time response (to allow physiological manoeuvres).
9. Measure all three degrees of angular rotation and be insensitive to ocular translation
10. Be easily extended to binocular recording
11. Be compatible with head and body recordings
12. Be easy to use on a variety of subjects
To summarize this brief history: even though eye gaze tracking systems have existed for a long time, the
tracking and measurement of eye behaviour and gaze direction was until recently a very complex and
expensive task reserved for research or military labs. The eye tracking devices were uncomfortable head-
mounted systems, and thus were mainly used as pointing devices for a very narrow range of applications
(mainly military). However, rapid technological advancements (increased processor speed, advanced
digital video processing) have both lowered the cost and dramatically increased the efficiency of eye and
gaze tracking equipment. In the next two sections, we present a review of eye movement tracking systems
and applications.
2.2 A review of eye and gaze tracking systems
The most widely used current designs are video-based eye trackers. Even though these techniques are
predominant, we have to mention, in order to be complete, the electro-oculography (EOG) tracking technique. It
is based on the fact that an electrostatic field exists when the eyes rotate. By recording small differences in
the skin potential around the eye, the position of the eye can be estimated. Also, since this is done with
electrodes placed on the skin around the eye, this technique does not require a clear view of the eye. This
technique is rather troublesome though, and is not well-suited for everyday use, since it requires the close
contact of electrodes to the user; a recent application can nevertheless be found in [1].
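The EOG principle described above can be sketched as a simple calibrated linear mapping from skin-potential differences to gaze angle. This is only an illustrative sketch of the technique, not the method of any real EOG device; the voltages, gain and calibration targets below are hypothetical.

```python
# Minimal sketch of electro-oculography (EOG) gaze estimation.
# The corneo-retinal potential makes the eye act like a dipole, so the
# voltage between electrodes placed on either side of the eye varies
# roughly linearly with horizontal gaze angle over a modest range.
# Values below are hypothetical; real systems calibrate per user.

def calibrate(voltages_uV, angles_deg):
    """Least-squares fit of angle = gain * voltage + offset."""
    n = len(voltages_uV)
    mean_v = sum(voltages_uV) / n
    mean_a = sum(angles_deg) / n
    cov = sum((v - mean_v) * (a - mean_a) for v, a in zip(voltages_uV, angles_deg))
    var = sum((v - mean_v) ** 2 for v in voltages_uV)
    gain = cov / var
    offset = mean_a - gain * mean_v
    return gain, offset

def gaze_angle(voltage_uV, gain, offset):
    """Estimate horizontal gaze angle (degrees) from one EOG sample."""
    return gain * voltage_uV + offset

# Calibration: the user fixates targets at known angles while the
# corresponding electrode voltages (microvolts) are recorded.
gain, offset = calibrate([-200.0, 0.0, 200.0], [-20.0, 0.0, 20.0])
angle = gaze_angle(100.0, gain, offset)  # about 10 degrees
```

In practice the voltage/angle relation drifts over time, which is one reason the text calls the technique troublesome for everyday use.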
Fig: 2.2 Eye Tracking system
Left: the Czech anatomist Jan Evangelista Purkyně (1787-1869) and his four images. Right:
representation of a bright corneal reflection of near-infrared diode light
Concerning video-based eye trackers, one or two cameras focus on one or both eyes and record and/or
analyse their movements. In this section, we present a very simple and well-known taxonomy in which we
separate systems into two main categories:
• Head-mounted systems;
• Non-intrusive systems.
Each one splits into two further categories depending on the kind of light they use:
• Ambient light;
• Infrared or near-infrared light.
Head-mounted systems are commonly composed of cameras (1, 2 or 3) and diodes which
provide light. These systems have followed the same path as computers: smaller and faster (Figure 2.4).
Nowadays, it is quite easy to follow and analyse the four Purkinje images. Purkinje images are reflections
of objects from the structures of the eye. There are at least four Purkinje images that are visible when looking
at an eye. The first Purkinje image (P1) is the reflection from the outer surface of the cornea. The second
one (P2) is the reflection from the inner surface of the cornea. The third one (P3) is the reflection from the
anterior surface of the lens, and the last one (P4) is the reflection from the posterior surface of the lens [10].
Using light-emitting diodes on a head-mounted system, it is possible to record several images which
represent the reflections of the emitted light in the eyes.
Fig: 2.2.1 Eye Tracking system (used in the Dual Purkinje Image method)
If the illumination is coaxial with the optical path then it
produces a bright-pupil effect similar to red eye. If the illumination source is offset from the optical path,
then the pupil appears dark. Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust
eye tracking and more reliable tracking in lighting conditions ranging from total darkness to very bright.
However, bright-pupil techniques are not effective for tracking outdoors, as extraneous infrared sources
interfere with monitoring. Regarding technology, some eye tracking systems require the head to be stable
(for example, with a chin rest), and some function remotely and automatically track the head during
motion. Concerning frame-rate acquisition, most systems use a sampling rate between 30 Hz and 50/60 Hz. In
the field of neurobiology, or in order to capture the detail of the very rapid eye movements during reading,
some of them can run at 240, 350 or even 1000/1250 Hz.
The other main category of eye and gaze tracking system is the non intrusive one. It provides some
advantages compared to the head-mounted systems, some of the most obvious are: it should allow natural
head movements;it should be able to perform with a wide variety of eye shapes, contact lenses or glasses;
it should be real time. With the increasing processing speed of computers, it is now possible to analyse
digital videos of face and eye movements in order to provide reliable measures. We can mention three
main approaches to detecting and measuring eye and gaze in a non-invasive way, depending on the kind of
features they use: the glint; a 3D model; a local linear map network. The first and most commonly
used approach is to calculate the angle of the visual axis and the location of the fixation point on the
display surface by tracking the relative position of the pupil and a point of light reflected from the cornea,
i.e. the glint. Infrared-light-enhanced measures take advantage of the bright-pupil effect (a recent study of
such a system may be found in the literature). The second one consists in the use of serialized image processing
methods to detect the face, pupils, mouth and nostrils; once these treatments are done, a 3D model is used to
evaluate face orientation and, finally, gaze direction is estimated using eye images (a very recent work can
be found in the literature). The last approach is more marginal: it consists in the use of neural networks of the local linear
map type, which enable a computer to identify the head orientation of a user by learning from examples.
In each case two comments can be made: firstly, infrared light may improve results; secondly,
calibration is a real problem, as a model is either adapted in real time or built before the tracking and
adapted for a single person. The price range of most commercially available eye-trackers is between
$5,000 and $60,000. In the next section, we present the most popular applications using eye and gaze
tracking systems.
2.3 A review of eye and gaze tracking applications
CHAPTER 3
System Development
3.1 Eyegaze Communication
In a culture dominated by visual images, most people use their eyes to obtain vast amounts of
information without the need for direct or close physical contact. A human recognizes the outside world
through the ability of nervous systems which construct internal visual representations of the outside
world. The eye is like a camera in that it has a set of lenses in the front (the cornea and the lens) that focus
images on a light-sensitive film (the retina) in the back [see photography]. The retina contains several
layers of nerve cells that analyze visual information before it ever leaves the eye. Signals from the retina
are transmitted via the optic nerve to a way station in the core of the brain called the geniculate body, then
to the primary visual cortex at the back of the brain. Our image of the world is mapped topographically onto
the visual cortex. It is important to note that the internal perception of visual media reflects not only its
physical properties, but also the changes induced by its transduction, filtering, and transformation
by the nervous system. It is the brain, and not the eye, that is the true organ of visual perception. Given the
brain's integral interpretive role in the construction of any complex visual impression, it is necessary to be
aware of how a human understands his or her physical environment as a perceived environment.
The term "gaze" is broadly used by media theorists to refer both to the ways in which viewers look at
images of people in any visual medium and to the gaze of those depicted in visual texts. The "gaze" is a
double-sided term. There must be someone to gaze and there may be someone to gaze back. To give the
gaze is to perceive that one is looking at an object. To set oneself at gaze is to expose oneself to view or
display oneself. Words for the agent of gazing are beholder, viewer, and occasionally spectator or
audience. Like a person, gaze also can be exchanged in a medium. Several key forms of gaze can be
identified in photographic, filmic or televisual texts, or in figurative graphic art, based on who is doing the
looking: the spectator's gaze, the intra-diegetic gaze, the direct or extra-diegetic address to the viewer, and
the look of the camera. The antiquity of the discourse on gaze can be seen in such myths as that of the evil
eye and the gorgon Medusa, whose gaze could turn its object to stone. Folkloric representations of eyes
sought to protect their wearers from the power of the evil gaze. In the nineteenth century, the discourse on
the visually perceptual object was centered on an opposition between the optical and tactile senses. The
tactile sense placed us in contact with reality while the optical sense was regarded as the sense of the
intellect, the spirit, and the imagination. Impressionists and symbolists were attracted by the fact that
optical perception seemed to unite the subjectivity of artistic vision with the objectivity of the external
world. It survived in the work of critics of the mid-twentieth century who used formal criteria to interpret
such artistic movements as abstract expressionism. This discourse was continued, but replaced to a large
extent by the term "gaze." In early twentieth-century, German expressionism exploited the sense of power
in images that stared out at the viewer menacingly. The charisma of the gaze came to its peak in Hitler,
who prided himself on his hypnotic gaze. Jean-Paul Sartre's almost paranoid treatment of " le regard " (the
look) in his treatise on existential philosophy, Being and Nothingness , portrayed the state of being
watched as a threat to the self.
A late-twentieth century interest in the eye and the gaze has been largely investigated so far in terms of
psychoanalysis. According to Jacques Lacan, human recognition of the visual object is overlaid with mis-
recognition. In "Of the Gaze as Objet Petit a", Lacan indicates some sort of outside observer; the objet
petit a is the lure for the subject's desire. The embodiment of objet petit a is what we may call the gaze.
According to Lacan, the subject's attempt to view the other must pass through the intermediary. The plane
mirror provides a virtual image that covers up the fundamental lack in the real image. Thus, the gaze
corresponds to desire, the desire for self-completion through the other. "The eye and the gaze--this is for
us the split in which the drive is manifested at the level of the scopic field."In this permutation the gaze is
the unattainable object of desire that seemed to make the other complete. However, for Lacan, it is
important to understand that the eye and gaze, although split, are part of the same person. Marshall
McLuhan, in his Understanding Media: The Extensions of Man, refers to the tragedy of Narcissus caused by
the misrecognition of his own image: "The Greek myth of Narcissus is directly concerned with a fact of
human experience, as the word Narcissus indicates. It is from the Greek word narcosis, or numbness. The
youth Narcissus mistook his own reflection in the water for another person. This extension of himself by
mirror numbed his perceptions until he became the servomechanism of his own extended or repeated
image. The nymph Echo tried to win his love with fragments of his own speech, but in vain. He was
numb. He had adapted to his extension of himself and had become a closed system." Lev Manovich notes
that Lacan emphasizes that perspective extends beyond the domain of the visible. Manovich points out
that Lacan reminds us that an image is anything defined "by the correspondences from one point to
another in space" and the idea that perspective is not only limited to sight but also functions in other
senses defines the classical discourses on perception: "The whole trick, the key presto!, of the classic
dialectic around perception, derives from the fact that it deals with geometric vision, that is to say, with
vision in so far as it is situated in a space that is not in its essence the visual."
CHAPTER 4
Performance Analysis
4.1 How does the Eyegaze System work?
As a user sits in front of the Eyegaze monitor, a specialized video camera mounted below the
monitor observes one of the user's eyes. Sophisticated image-processing software in the Eyegaze System's
computer continually analyzes the video image of the eye and determines where the user is looking on the
screen. Nothing is attached to the user's head or body.
Fig:4.1 Eyegaze control monitor
In detail the procedure can be described as follows: The Eyegaze System uses the pupil-
center/corneal-reflection method to determine where the user is looking on the screen. An infrared-
sensitive video camera, mounted beneath the System's monitor, takes 60 pictures per second of the user's
eye. A low-power, infrared light-emitting diode (LED), mounted in the center of the camera's lens,
illuminates the eye. The LED reflects a small bit of light off the surface of the eye's cornea. The light also
shines through the pupil and reflects off of the retina, the back surface of the eye, and causes the pupil to
appear white. The bright-pupil effect enhances the camera's image of the pupil and makes it easier for the
image processing functions to locate the center of the pupil. The computer calculates the person's
gazepoint, i.e., the coordinates of where he is looking on the screen, based on the relative positions of the
pupil center and corneal reflection within the video image of the eye. Typically the Eyegaze System
predicts the gazepoint with an average accuracy of a quarter inch or better. Prior to operating the
eyetracking applications, the Eyegaze System must learn several physiological properties of a user's eye
in order to be able to project his gazepoint accurately. The system learns these properties by performing a
calibration procedure. The user calibrates the system by fixing his gaze on a small yellow circle displayed
on the screen, and following it as it moves around the screen. The calibration procedure usually takes
about 15 seconds, and the user does not need to recalibrate if he moves away from the Eyegaze System
and returns later.
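The pupil-centre/corneal-reflection method and its calibration step can be illustrated with a small sketch: for each screen axis, a linear map from the pupil-to-glint vector (measured in the camera image) to screen coordinates is fitted by least squares from the calibration fixations. This is a generic illustration of the technique, not LC Technologies' actual algorithm; all coordinates below are hypothetical.

```python
# Sketch of gazepoint prediction from the pupil-centre / corneal-reflection
# ("glint") vector. Calibration: the user fixates known screen points while
# the pupil-glint vector is measured; a linear map per axis is then fitted.

def fit_axis(vectors, targets):
    """Least-squares fit of target = a*vx + b*vy + c for one screen axis,
    solving the 3x3 normal equations by Gauss-Jordan elimination."""
    rows = [[vx, vy, 1.0] for vx, vy in vectors]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    atb = [sum(r[i] * t for r, t in zip(rows, targets)) for i in range(3)]
    m = [ata[i] + [atb[i]] for i in range(3)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def calibrate(vectors, screen_points):
    ax = fit_axis(vectors, [p[0] for p in screen_points])
    ay = fit_axis(vectors, [p[1] for p in screen_points])
    return ax, ay

def gazepoint(vector, ax, ay):
    """Predict on-screen (x, y) from one pupil-glint vector."""
    vx, vy = vector
    return (ax[0] * vx + ax[1] * vy + ax[2],
            ay[0] * vx + ay[1] * vy + ay[2])

# Hypothetical calibration data: pupil-glint vectors (image units) observed
# while the user followed six known targets (screen pixels).
vectors = [(-3.0, -2.0), (0.0, -2.0), (3.0, -2.0),
           (-3.0, 2.0), (0.0, 2.0), (3.0, 2.0)]
targets = [(100, 100), (640, 100), (1180, 100),
           (100, 900), (640, 900), (1180, 900)]
ax, ay = calibrate(vectors, targets)
```

Real systems may use higher-order terms to model corneal curvature, but the calibrate-then-predict structure mirrors the 15-second procedure described above.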
4.2 How to run the Eyegaze System?
A user operates the Eyegaze System by looking at rectangular keys that are displayed on the control
screen. To "press" an Eyegaze key, the user looks at the key for a specified period of time. The gaze
duration required to visually activate a key, typically a fraction of a second, is adjustable. An array of menu
keys and exit keys allows the user to navigate around the Eyegaze programs independently.
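The dwell-based "press" described above can be sketched as follows. The key layout, dwell duration and sample rate are illustrative assumptions, not the real Eyegaze menu.

```python
# Sketch of dwell-time key activation: a key "presses" once the gazepoint
# has stayed inside its rectangle for a configurable duration.

DWELL_SECONDS = 0.5  # adjustable gaze duration required to activate a key

class DwellKey:
    def __init__(self, name, x, y, w, h):
        self.name, self.rect = name, (x, y, w, h)
        self.enter_time = None  # when the gaze entered the key, if inside

    def contains(self, gx, gy):
        x, y, w, h = self.rect
        return x <= gx < x + w and y <= gy < y + h

    def update(self, gx, gy, now):
        """Feed one gaze sample; return True when the key activates."""
        if not self.contains(gx, gy):
            self.enter_time = None        # gaze left the key: reset timer
            return False
        if self.enter_time is None:
            self.enter_time = now         # gaze just entered the key
        if now - self.enter_time >= DWELL_SECONDS:
            self.enter_time = None        # activate once, then re-arm
            return True
        return False

key = DwellKey("SPEAK", 100, 100, 200, 80)
# 40 gaze samples at 60 Hz, fixated on the key: activates after ~0.5 s.
pressed = [key.update(150, 130, t / 60.0) for t in range(40)]
```

Glancing away at any point resets the timer, which is how dwell activation avoids accidental "presses" during casual scanning of the screen.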
CHAPTER 5
CONCLUSION
5.1 Conclusion
Today, the human eye-gaze can be recorded by relatively unremarkable techniques. This report
argues that it is possible to use the eye-gaze of a computer user in the interface to aid the control of the
application. Care must be taken, though, that eye-gaze tracking data is used in a sensible way, since the
nature of human eye-movements is a combination of several voluntary and involuntary cognitive
processes.
The main reason for eye-gaze based user interfaces being attractive is that the direction of the eye-
gaze can express the interests of the user (it is a potential porthole into the current cognitive processes), and
communication through the direction of the eyes is faster than any other mode of human communication.
It is argued that eye-gaze tracking data is best used in multimodal interfaces where the user interacts with
the data instead of the interface, in so-called non-command user interfaces.
5.2 Advantages
1. Eye movement is faster than other current input media
2. No training or particular coordination is required for normal users
3. Can determine where the user’s interest is focused automatically
4. Helpful for usability studies, to understand how users interact with their environment
5.3 Disadvantages
1. The equipment is expensive
2. Some users can't work with the equipment (for example if they wear contact lenses or have
long eye lashes)
3. Calibrating the equipment takes time; this may cause the user to give up on using the device.
5.4 Application
Every year more than 100,000 people are diagnosed with motor neurone diseases. Typically, even
when all other ways of communicating are either severely damaged or completely lost, the eyes still
function. Communication by Gaze Interaction (COGAIN) is a Network of Excellence designed
specifically to help people with these disabilities to communicate more effectively with eye gaze. At the
COGAIN stand you can see how this technology is used by a person who relies on it. Current eye
tracking equipment allows users to generate text on a computer by using eye gaze. Users are able to select
letters and numbers by looking at a keyboard on a screen with their eyes, and can construct words and
sentences that can be spoken aloud by the system. Using these systems both empowers and enables
people with disabilities as they can now communicate without the need for an assistant or helper, giving
the users greater freedom in their lives. A wide variety of disciplines use eye tracking techniques,
including cognitive science, psychology (notably psycholinguistics, the visual world paradigm), human-
computer interaction (HCI), marketing research and medical research (neurological diagnosis). Specific
applications include tracking eye movements in language reading, music reading, human activity
recognition, the perception of advertising, and the playing of sport. Uses include:
Cognitive Studies
Medical Research
Laser refractive surgery
Human Factors
Computer Usability
Translation Process Research
Vehicle Simulators
In-vehicle Research
Training Simulators
5.5 SCOPE FOR FUTURE DEVELOPMENT
A non-intrusive system to localize the eyes and monitor fatigue was developed. Information about
the head and eyes position are obtained through various self-developed image processing algorithms.
During the monitoring, the system is able to decide whether the eyes are opened or closed. When the eyes
have been closed for two seconds, a warning signal is issued. In addition during monitoring, the system is
able to automatically detect any eye localizing error that might have occurred. In case of this type of error,
the system is able to recover and properly localize the eyes. The proposed system was tested on real
driver images. Video images [480 x 640 pixels] of 75 different test persons were recorded
during day, night and complex-background conditions at different places. The proposed system has two key
phases, preprocessing and detecting the eye from video images, described in Chapters 5 and 6
respectively. In preprocessing, a new technique is used to enhance the contrast of dark regions
and is tested against an existing algorithm. As per the results obtained in section 5.5, all the noise in the video
image is removed successfully. In the second phase, new techniques are used to extract the eye from the
preprocessed image. This is also tested against a standard existing method, and the comparison results are shown
in Tables 6.1 and 6.2. The eye pair can be selected successfully in most cases, as shown in Figs 6.1, 6.2 and
6.3. Chapter 6 also reports the false detection rate of drowsiness for the colour-cue and projection-function methods.
5.5.1 Achievements
DDDS achieves highly accurate and reliable detection of drowsiness. DDDS offers a non-
intrusive approach that detects drowsiness without annoyance or interference, judging the driver's
alertness level on the basis of continuous eye closures. The proposed system works in both daytime and
night-time conditions. All the drawbacks mentioned in Section 2.5 have been eliminated. In future, this
prototype can be extended to give an alarm before the driver falls asleep by measuring the heart beat
without physical disturbance, i.e., a non-intrusive method using modified ECG techniques. In the usual
ECG method, key points of the body (for example the chest, head and wrists) are attached to wires; in
the extended method, attaching wires may be avoided. This will lead to a way of finding the optimum
level of drowsiness. Further, this prototype could be extended to monitor the ray reflected from the eye
using a nano camera: if the reflected ray is absent, the eye is closed; otherwise it is open. We believe
this will create a better opportunity to detect drowsiness.
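The proposed reflection test (reflected ray present means the eye is open) amounts to checking the eye region for the bright corneal glint. The sketch below illustrates that decision rule only; the brightness threshold and the grayscale pixel representation are assumptions, not part of the report's prototype.

```python
# Hypothetical sketch of the reflection-based open/closed test: an
# illumination source produces a small bright glint on the cornea, so
# the absence of any bright pixel in the eye region is read as a closed
# (lid-covered) eye. The threshold value is an assumed parameter.

REFLECTION_THRESHOLD = 200  # assumed glint brightness on a 0-255 scale

def eye_is_open(eye_region_pixels):
    """eye_region_pixels: iterable of grayscale intensities (0-255)."""
    return any(p >= REFLECTION_THRESHOLD for p in eye_region_pixels)
```

In practice such a test would also need to reject other specular highlights (e.g. from glasses), which is why the report pairs it with the colour-cue and projection-function checks.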