BLUE EYES TECHNOLOGY
College: Vasireddy Venkatadri Institute Of Technology
Dept: Computer Science Engineering
S. Jaya Sindhura(12BQ1A0595)
M. Sujana(12BQ1A0566)
ABSTRACT
Is it possible to create a computer that can interact with us as we interact with each other? Imagine that one fine morning you walk into your computer room and switch on your computer, and it tells you: "Hey friend, good morning, you seem to be in a bad mood today." It then opens your mailbox, shows you some of your mails, and tries to cheer you up. This may sound like fiction, but it will be everyday life with "BLUE EYES" in the very near future. The basic idea behind this technology is to give computers human-like perceptual power. We all have perceptual abilities: we can understand each other's feelings, for example by reading a person's emotional state from their facial expression. Adding these human perceptual abilities to computers would enable them to work together with human beings as intimate partners. The "BLUE EYES" technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. How can we make computers "see" and "feel"? BLUE EYES uses sensing technology to identify a user's actions and to extract key information. For example, a future BLUE EYES-enabled television could become active when the user makes eye contact, at which point the user could tell the television to "turn on". This paper describes the software, benefits, and interconnection of the various parts involved in the BLUE EYES technology.
INTRODUCTION
Imagine yourself in a world where humans interact naturally with computers. You are sitting in front of a personal computer that can listen, talk, or even scream aloud. It has the ability to gather information about you and interact with you through techniques such as facial recognition and speech recognition. It can even understand your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you.
The BLUE EYES technology aims at creating
computational machines that have perceptual
and sensory ability like those of human beings.
It uses non-obtrusive sensing methods, employing modern video cameras and microphones, to identify the user's actions through its imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.
The primary objective of the research is to give a computer the human ability to assess a situation by using the senses of sight, hearing, and touch. The BLUE EYES project aims at creating computational devices with the sort of perceptual abilities that people take for granted. Thus BLUE EYES is a technology for making computers sense and understand human behavior and feelings and react in appropriate ways.
AIMS:
1) To design smarter devices
2) To create devices with emotional intelligence
3) To create computational devices with perceptual abilities
TRACKS USED
Our emotional changes are mostly reflected in our heart rate, breathing rate, facial expressions, eye movements, voice, and so on. These are therefore the parameters on which Blue Eyes technology is built. To make computers see and feel, Blue Eyes uses sensing technology to identify a user's actions and to extract key information. This information is then analyzed to determine the user's physical, emotional, or informational state, which in turn can be used to make the user more productive by performing expected actions or providing expected information.
TECHNOLOGIES USED
The process of making emotional computers
with sensing abilities is known as affective
computing. The steps involved are:
1) Giving machines sensing abilities
2) Detecting human emotions
3) Responding appropriately
The first step is to give machines the equivalent of the eyes, ears, and other sensory organs that humans use to recognize and express emotion. Computer scientists are exploring a variety of mechanisms, including voice-recognition
software that can discern not only what is being
said but the tone in which it is said; cameras that
can track subtle facial expressions, eye
movements, and hand gestures; and biometric
sensors that can measure body temperature,
blood pressure, muscle tension, and other
physiological signals associated with emotion.
In the second step, the computer has to detect even minor variations in our mood; for example, a person may hit the keyboard very fast either in a happy mood or in an angry mood.
In the third step, the computer has to react in accordance with the detected emotional state. A minimal sketch of this three-step loop is shown below.
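The Python sketch below illustrates the sense-detect-respond control flow; the sensor readings, thresholds, and emotion labels are invented for the example and do not come from any published Blue Eyes implementation.

```python
def sense():
    """Step 1: gather raw signals (hard-coded stand-ins for real sensors)."""
    return {"typing_speed_cps": 9.5, "heart_rate_bpm": 96, "skin_temp_c": 33.1}

def detect_emotion(signals):
    """Step 2: map signals to a coarse emotional state (toy thresholds)."""
    if signals["heart_rate_bpm"] > 90 and signals["typing_speed_cps"] > 8:
        return "agitated"  # fast typing plus a high pulse: angry or excited
    return "calm"

def respond(emotion):
    """Step 3: react in accordance with the detected state."""
    actions = {"agitated": "soften the UI and suggest a break",
               "calm": "proceed normally"}
    print(f"Detected {emotion!r}: {actions[emotion]}")

respond(detect_emotion(sense()))
```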
Various methods of accomplishing affective computing:
1) MAGIC POINTING
2) SUITOR
3) EMOTIONAL MOUSE
4) ARTIFICIAL INTELLIGENCE AND SPEECH RECOGNITION
MAGIC POINTING:
MAGIC stands for Manual And Gaze Input Cascaded pointing. A computer with this
technology could move the cursor by following
the direction of the user's eyes. This type of
technology will enable the computer to
automatically transmit information related to the
screen that the user is gazing at. Also, it will
enable the computer to determine, from the
user's expression, if he or she understood the
information on the screen, before automatically
deciding to proceed to the next program. The
user pointing is still done by the hand, but the
cursor always appears at the right position as if
by MAGIC. By combining manual input with eye tracking, we get MAGIC pointing. Gaze tracking has long been considered an alternative, and potentially superior, pointing method for computer input.
Two specific MAGIC pointing techniques, one conservative and one liberal, were designed, analyzed, and implemented with an eye tracker. With pure gaze pointing, one has to be conscious of where one looks and how long one looks at an object: if one does not look at a target continuously for a set threshold (e.g., 200 ms), the target will not be successfully selected.
Once the cursor position has been redefined, the user need only make a small movement to, and click on, the target with a regular manual input device. We designed two MAGIC pointing techniques, one liberal and the other conservative in terms of target identification and cursor placement.
In the liberal MAGIC pointing technique, the cursor is placed in the vicinity of any target the user fixates on. The user actuates the input device, observes the cursor position, and decides in which direction to steer the cursor. The cost of this method is the increased manual movement amplitude. A minimal sketch of this warping logic follows.
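This sketch assumes the 200 ms dwell threshold mentioned above; the "vicinity" radius is an invented parameter, since the text gives no figure.

```python
import math

FIXATION_MS = 200     # dwell threshold from the text
WARP_RADIUS_PX = 120  # assumed "vicinity" radius; not specified in the text

def liberal_magic(cursor, gaze, fixation_ms):
    """Warp the cursor near each new fixation; the hand finishes the pointing."""
    if fixation_ms < FIXATION_MS:
        return cursor  # no stable fixation yet, leave the cursor alone
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    if math.hypot(dx, dy) <= WARP_RADIUS_PX:
        return cursor  # already close enough; avoid jittery warps
    return gaze        # warp to the fixation point

print(liberal_magic(cursor=(50, 50), gaze=(800, 400), fixation_ms=250))
```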
The conservative MAGIC pointing technique uses an "intelligent offset". To initiate a pointing trial, two strategies are available to the user. One is to follow "virtual inertia": move from the cursor's current position towards the new target the user is looking at. This is likely the strategy users will employ, given how they interact with today's interfaces. The alternative strategy, which may be more advantageous but takes time to learn, is to ignore the previous cursor position and make whichever motion is most convenient and least effortful for the given input device.
The goal of the conservative MAGIC pointing
method is the following. Once the user looks at a
target and moves the input device, the cursor
will appear "out of the blue" in motion towards
the target, on the side of the target opposite to
the initial actuation vector. In comparison to the
liberal approach, this conservative approach has
both pros and cons. While with this technique
the cursor would never be over-active and jump
to a place the user does not intend to acquire, it
may require more hand-eye coordination effort.
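As a rough sketch of the "intelligent offset", the code below places the cursor on the side of the gaze target opposite the initial actuation vector, so the hand's ongoing motion carries it onto the target; the offset magnitude is an assumption, not a figure from the text.

```python
import math

OFFSET_PX = 40  # assumed offset magnitude; the text gives no number

def conservative_magic(gaze, actuation_vector):
    """Place the cursor opposite the initial hand motion, short of the target."""
    vx, vy = actuation_vector
    norm = math.hypot(vx, vy) or 1.0
    # Step back from the gaze point against the direction of hand motion,
    # so continuing that motion moves the cursor onto the target.
    return (gaze[0] - OFFSET_PX * vx / norm,
            gaze[1] - OFFSET_PX * vy / norm)

# The hand starts moving right and slightly down, so the cursor appears
# to the left of and above the target.
print(conservative_magic(gaze=(800, 400), actuation_vector=(1.0, 0.4)))
```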
MAGIC pointing techniques offer the following
potential advantages:
1. Reduction of manual stress and fatigue, since
the cross screen long-distance cursor
movement is eliminated from manual control.
2. Practical accuracy level. In comparison to
traditional pure gaze pointing whose accuracy
is fundamentally limited by the nature of eye
movement, the MAGIC pointing techniques let
the hand complete the pointing task, so they
can be as accurate as any other manual input
techniques.
3. A more natural mental model for the user. The
user does not have to be aware of the role of
the eye gaze. To the user, pointing continues
to be a manual task, with a cursor
conveniently appearing where it needs to be.
4. Speed. Since the need for large magnitude
pointing operations is less than with pure
manual cursor control, it is possible that
MAGIC pointing will be faster than pure
manual pointing.
5. Improved subjective speed and ease-of-use.
Since the manual pointing amplitude is smaller,
the user may perceive the MAGIC pointing
system to operate faster and more pleasantly
than pure manual control, even if it operates at
the same speed or more slowly.
ADVANTAGES of the liberal and conservative approaches:
1) Reduction of manual stress and fatigue
2) Practical accuracy level
3) A more natural mental model for the user
4) Faster than pure manual pointing
5) Improved subjective speed and ease of use
DISADVANTAGES of the liberal and conservative approaches:
1) The liberal approach is distracting when the user is trying to read.
2) The motor action computation cannot start until the cursor appears.
3) In the conservative approach, uncertainty about the exact cursor location prolongs the target acquisition time.
EYE TRACKER
[Figure: The liberal MAGIC pointing technique, with the cursor placed in the vicinity of the target the user fixates on.]
[Figure: The conservative MAGIC pointing technique with "intelligent offset".]
[Figure 4.3: Bright (left) and dark (right) pupil images resulting from on-axis and off-axis illumination. The glints, or corneal reflections, from the on- and off-axis light sources can be easily identified as the bright points in the iris.]
Eye tracking data can be acquired
simultaneously with MRI scanning using a
system that illuminates the left eye of a subject
with an infrared (IR) source, acquires a video
image of that eye, locates the corneal reflection
(CR) of the IR source, and in real time
calculates/displays/records the gaze direction
and pupil diameter.
Once the pupil has been detected, the corneal reflection is determined from the dark pupil image. The pupil-reflection vector is then used to estimate the user's point of gaze in terms of the screen coordinates at which the user is looking, after an initial calibration procedure similar to that required by commercial eye trackers. A toy version of this calibration and mapping is sketched below.
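The sketch fits a simple affine least-squares map from pupil-reflection vectors to screen coordinates; commercial trackers use richer (e.g., polynomial) models, and the calibration data here is invented.

```python
import numpy as np

def fit_calibration(pupil_cr_vectors, screen_points):
    """Least-squares fit of an affine map from pupil-reflection vectors
    to screen coordinates."""
    X = np.hstack([pupil_cr_vectors, np.ones((len(pupil_cr_vectors), 1))])
    coeffs, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return coeffs

def gaze_point(coeffs, pupil_cr_vector):
    """Apply the fitted map to one pupil-reflection measurement."""
    return np.append(pupil_cr_vector, 1.0) @ coeffs

# Toy calibration: the user fixates four known screen targets while the
# tracker records the corresponding pupil-reflection vectors.
vectors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
targets = np.array([[100, 100], [900, 100], [100, 700], [900, 700]])
coeffs = fit_calibration(vectors, targets)
print(gaze_point(coeffs, [0.5, 0.5]))  # roughly [500. 400.]
```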
2) SUITOR
SUITOR stands for Simple User Interest Tracker. It implements a method for putting computational devices in touch with their users' changing moods and interests. By watching which web page the user is currently browsing, SUITOR can find additional information on that topic. The key is that the user simply interacts with the computer as usual, and the computer infers the user's interest from what it sees the user do. A toy sketch of this idea follows.
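The sketch tallies salient words from pages the user dwells on, then ranks candidate items by overlap. The dwell-time weighting and scoring are invented here; IBM's actual SUITOR also watched gaze and typing.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "of", "and", "to", "in", "is", "on"}
interest = Counter()

def observe_page(text, dwell_seconds):
    """Weight a page's words by how long the user lingered on it."""
    for word in re.findall(r"[a-z]+", text.lower()):
        if word not in STOPWORDS:
            interest[word] += dwell_seconds

def suggest(candidates):
    """Rank candidate headlines by overlap with the accumulated interests."""
    def score(title):
        return sum(interest[w] for w in re.findall(r"[a-z]+", title.lower()))
    return max(candidates, key=score)

observe_page("Gaze tracking and eye movement research", dwell_seconds=40)
observe_page("Stock market update", dwell_seconds=3)
print(suggest(["New eye tracking hardware", "Football scores"]))
```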
3) EMOTION MOUSE
This is a mouse embedded with sensors that can read physiological attributes such as temperature, body pressure, pulse rate, and touching style. The computer can determine the user's emotional state from a single touch. IBM is still performing research on this mouse; it is expected to reach the market within the next two or three years, with an expected accuracy of 75%.
One goal of human computer interaction (HCI)
is to make an adaptive, smart computer system.
In order to start creating smart computers, the
computer must start gaining information about
the user. One proposed method for gaining user
information through touch is via a computer
input device, the mouse. From the physiological
data obtained from the user, an emotional state
may be determined which would then be related
to the task the user is currently doing on the
computer. Over a period of time, a user model
will be built in order to gain a sense of the
user's personality.
By matching a person's emotional state with the context of the expressed emotion over a period of time, the person's personality is revealed. Therefore, by giving the computer a longitudinal understanding of its user's emotional state, the computer could adopt a working style that fits the user's personality. The result of this collaboration could be increased productivity for the user. One way of gaining information from a user non-intrusively is by video: cameras have been used to detect a person's emotional state. Here we explore gaining information through touch, and one obvious place to put sensors is on the mouse. A toy classifier over such sensor readings is sketched below.
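This sketch classifies one mouse-style reading by its nearest emotion centroid. The feature set loosely follows the measures discussed in the experiment below (GSR, pulse, skin temperature), but every number is invented.

```python
import math

# Invented centroids: (GSR, pulse in bpm, skin temperature in deg C).
CENTROIDS = {
    "anger":   (8.0, 95.0, 34.5),
    "sadness": (3.0, 70.0, 32.0),
    "joy":     (5.0, 85.0, 33.5),
}

def classify(sample):
    """Nearest-centroid guess at the emotional state from one reading."""
    return min(CENTROIDS, key=lambda emotion: math.dist(sample, CENTROIDS[emotion]))

print(classify((7.5, 92.0, 34.0)))  # -> 'anger'
```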
EXPERIMENT
Based on Paul Ekman's facial expression work, we see a correlation between a person's emotional state and a person's physiological measurements. Selected works by Ekman and others on measuring facial behavior describe Ekman's Facial Action Coding System (Ekman and Rosenberg, 1997).
One of his experiments involved participants
attached to devices to record certain
measurements including pulse, galvanic skin
response (GSR), temperature, somatic
movement and blood pressure. He then recorded
the measurements as the participants were
instructed to mimic facial expressions which
corresponded to the six basic emotions. He
defined the six basic emotions as anger, fear,
sadness, disgust, joy and surprise. From this
work, Dryer (1993) determined how
physiological measures could be used to
distinguish various emotional states. The
measures taken were GSR, heart rate, skin
temperature and general somatic activity (GSA).
These data were then subjected to two analyses. For the first analysis, a multidimensional scaling (MDS) procedure was used to determine the dimensionality of the data, as illustrated by the sketch below.
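A sketch of such an MDS step using scikit-learn; the 6x4 matrix (six emotions by four physiological measures) is random stand-in data, not the study's measurements.

```python
import numpy as np
from sklearn.manifold import MDS

# Stand-in data: one row per emotion, one column per measure
# (GSA, GSR, pulse, skin temperature).
rng = np.random.default_rng(0)
emotion_profiles = rng.normal(size=(6, 4))

# Embed the profiles in two dimensions and inspect how much structure survives.
embedding = MDS(n_components=2, random_state=0).fit_transform(emotion_profiles)
print(embedding.shape)  # (6, 2): each emotion placed in a 2-D space
```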
RESULT
The data for each subject consisted of scores for four physiological assessments (GSA, GSR, pulse, and skin temperature) for each of the six emotions (anger, disgust, fear, happiness, sadness, and surprise) across the five-minute baseline and test sessions. GSA data was sampled 80 times per second; GSR and temperature were reported approximately 3-4 times per second; and pulse was recorded as each beat was detected, approximately once per second.
4) ARTIFICIAL INTELLIGENCE
Artificial intelligence (AI) involves two basic ideas. First, it involves studying the thought processes of human beings. Second, it deals with representing those processes via machines (computers, robots, etc.). AI is the behavior of a machine which, if performed by a human being, would be called intelligent. It makes machines smarter and more useful, and is less expensive than natural intelligence.
Natural language processing (NLP) refers to
artificial intelligence methods of communicating
with a computer in a natural language like
English. The main objective of an NLP program is to understand the input and initiate an action. The input words are scanned and matched against
internally stored known words. Identification of a key word causes some action to be taken. In this way, one can communicate with the computer in one's own language; no special commands or computer language are required, and there is no need to enter programs in a special language to create such software. A toy keyword-spotting dispatcher is sketched below.
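The sketch scans input words against a stored vocabulary and triggers the matching action; the command table is invented for illustration.

```python
ACTIONS = {
    "mail": lambda: print("Opening mailbox..."),
    "music": lambda: print("Playing music..."),
    "stop": lambda: print("Stopping..."),
}

def handle(sentence):
    """Scan input words against the stored vocabulary; a hit triggers an action."""
    for word in sentence.lower().split():
        if word in ACTIONS:
            ACTIONS[word]()
            return
    print("No known keyword found.")

handle("please open my mail")  # -> Opening mailbox...
```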
SPEECH RECOGNITION:
The user speaks to the computer through a microphone, and the signal is passed through a bank of filters; a simple system may contain a minimum of three filters. The more filters used, the higher the probability of accurate recognition. The filter output is fed to an ADC to translate the analogue signal into a digital word. The ADC samples the filter outputs many times a second, each sample representing a different amplitude of the signal. The spoken words are processed by the filters and ADCs, and the binary representation of each word becomes a template, or standard, against which future words are compared. These templates are stored in memory. Once the storing process is complete, the system can go into its active mode and is capable of identifying spoken words. As each word is spoken, it is converted into its binary equivalent and stored in RAM. The computer then searches, comparing the binary input pattern with the templates. It is to be noted that even if the same speaker says the same text, there are always slight variations in the amplitude or loudness of the signal, pitch, frequency difference, time gap, and so on. For this reason there is never a perfect match between the template and the binary input word. The pattern matching process therefore uses statistical techniques and is designed to look for the best fit.
The values of the binary input word are subtracted from the corresponding values in the templates. If both values are the same, the difference is zero and there is a perfect match; if not, the subtraction produces some difference, or error. The smaller the error, the better the match. When the best match is found, the word is identified and displayed on the screen. A minimal sketch of this best-fit matching follows.
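The 8-sample "words" below are invented stand-ins; real systems compare filter-bank/ADC frames using statistical models, but the subtract-and-pick-smallest-error logic is the same.

```python
TEMPLATES = {
    "on":  [3, 7, 9, 8, 5, 2, 1, 0],
    "off": [1, 2, 4, 7, 9, 8, 4, 1],
}

def recognize(samples):
    """Return the template with the smallest summed absolute error."""
    def error(template):
        return sum(abs(s - t) for s, t in zip(samples, template))
    best = min(TEMPLATES, key=lambda word: error(TEMPLATES[word]))
    return best, error(TEMPLATES[best])

# A slightly noisy utterance of "on": never a perfect match, but the best fit.
print(recognize([3, 6, 9, 8, 6, 2, 1, 0]))  # -> ('on', 2)
```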
BLUE EYES HARDWARE
DATA ACQUISITION UNIT (DAU):
1) Its main task is to fetch the physiological data.
2) The sensor sends data to the central system to be processed.
3) ID cards and PIN codes provide the operator's authorization.
JAZZ MULTISENSOR:
1) It supplies raw digital data regarding eye position and the level of blood oxygenation.
2) Eye movement is measured using direct infrared transducers.
CENTRAL SYSTEM UNIT (CSU):
1) The box contains a Bluetooth module and a PCM codec for voice data transmission.
2) The unit maintains the other side of the Bluetooth connection, buffers incoming sensor data, and performs on-line data analysis.
CONNECTION MANAGER:
The Connection Manager handles:
1) Communication with the CSU hardware
2) Searching for new devices in the covered
range.
3) Establishing Bluetooth connections
4) Connection authentication
5) Incoming data buffering
6) Sending alerts
DATA ANALYSIS MODULE:
Performs the analysis of the raw sensor data in order to obtain information about the operator's physiological condition.
DATA LOGGER MODULE:
1) The raw or processed physiological data, alerts, and the operator's voice are stored.
2) A Voice Data Acquisition module delivers the voice data.
VISUALIZATION MODULE:
1) Enables supervisors to watch each working operator's physiological condition along with a preview of the selected video source and the related sound stream.
2) The Visualization module can be set to an off-line mode, where all data is fetched from the database.
BLUETOOTH technology provides the means for creating a personal area network linking the DAU and the central system unit; a minimal sketch of the CSU side of such a link follows.
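The sketch uses the PyBluez library over an RFCOMM transport; the DAU address and channel are placeholders, and the real system additionally handles authentication and alerting.

```python
import bluetooth  # PyBluez

DAU_ADDR = "00:11:22:33:44:55"  # placeholder Data Acquisition Unit address
CHANNEL = 1                     # assumed RFCOMM channel

# Open the CSU end of the personal area network and buffer sensor frames.
sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
sock.connect((DAU_ADDR, CHANNEL))
try:
    while True:
        frame = sock.recv(1024)  # raw physiological data from the Jazz sensor
        if not frame:
            break
        # ...hand the frame to the data analysis module here...
finally:
    sock.close()
```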
BLUE EYES enabled devices:
POD:
The first Blue Eyes-enabled mass-production device was POD, a car developed by Toyota. It could keep the driver alert and active: it could tell the driver to slow down if he is driving too fast, advise him to pull over when he feels drowsy, and even play him some interesting music when he is getting bored.
PONG:
IBM has released a robot called PONG, designed to demonstrate the new technology. The robot is equipped with a computer capable of analyzing a person's glances and other expressions of feeling before automatically determining the next action to take. PONG can perceive the person standing in front of it, smiles when the person calls its name, and expresses loneliness when it loses sight of the person.
APPLICATIONS
GENERIC CONTROL ROOMS:
Power stations
Flight control centers
AUTOMOBILE INDUSTRY:
The user can concentrate on observation and
manual operations, and still control the
machinery by voice input commands.
FLIGHT CONTROL CENTERS:
With reliable speech recognition equipment,
pilots can give commands and information to the
computers by simply speaking into their
microphones—they don’t have to use their
hands for this purpose.
AIR FORCE AND MILITARY:
To control weapons by voice commands
ADVANTAGES
1) Faster than pure manual pointing
2) Improved subjective speed and ease of use
3) Practical accuracy level
4) SUITOR
Computers would be far more powerful if they gained the perceptual and sensory abilities of living beings. What needs to be developed is an intimate relationship between computers and humans, and the Simple User Interest Tracker (SUITOR) is a revolutionary approach in this direction.
A car equipped with an affective computing
system could recognize when a driver is feeling
drowsy and advise her to pull over, or it might
sense when a stressed-out motorist is about to
explode and warn him to slow down and cool
off.
A computer endowed with emotional
intelligence, on the other hand, could recognize
when its operator is feeling angry or frustrated
and try to respond in an appropriate fashion.
Such a computer might slow down or replay a
tutorial program for a confused student, or
recognize when a designer is frustrated or vexed
and suggest he take a break.
DISADVANTAGES
1) The liberal approach is distracting when the user is trying to read.
2) The motor action computation cannot start until the cursor appears.
FUTURE ENHANCEMENTS
1) In the future, ordinary household devices, such as televisions and ovens, may be able to do their jobs when we look at them and speak to them.
2) The future applications of Blue Eyes technology are limitless.
CONCLUSION
1) Provides more delicate and user-friendly facilities in computing devices
2) The gap between the electronic and physical worlds is reduced
3) Computers can be run using implicit commands instead of explicit commands