Mardanbegi, D., and Hansen, D.W. "Eye-based head gestures for interaction in the car." In Proceedings of the 2012 ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '12). ACM, Portsmouth, NH, USA, 2012.
My name is Diako Mardanbegi. I am a PhD student at ITU working on developing and improving head-mounted eye trackers.
Car user interfaces have become more complex, with many new functionalities. The driver's focus on driving still has the highest priority, and our main concern is to make the tasks that are not related to controlling the vehicle as minimally distracting as possible. So we need to make the interaction methods simpler and more intuitive.
New interaction techniques like speech and gesture recognition seem to be intuitive ways of interacting, as they build on natural human communication skills. But speech has limitations:

1- Some studies report that voice-activated interfaces impose inappropriately high cognitive loads and can negatively affect driving performance. The main reason is that we are still far from achieving high-performance automatic speech recognition (ASR) systems.
2- There are also tasks, like adjusting the radio volume, opening a window just slightly, or continuously zooming or scrolling a map, that are not intuitive to perform solely through speech-based interaction.
3- Speech input also cannot be used when the environment is too noisy.
Interaction by head gestures involves less cognitive load for the driver.
Many methods for head gesture recognition have been used, but they mostly concentrate on detecting only head shakes and nods. Head gesture recognition, just like speech recognition, often requires a short explicit command, like pushing a button, before it can be used. And most of these methods are video based.
On the other hand, video-based eye trackers have been used in cars for different purposes, and they usually require a separate camera system. Eye trackers are used as fatigue-monitoring devices to improve driving safety. The eye, and the visual behaviors measured by an eye tracker, provide significant information about the driver's attention. The texture of the iris is also known to be a powerful biometric for human identification.
In this paper we suggest using a method called eye-based head gestures for interaction with in-car user interfaces. This method allows us to use the driver's visual attention and head gestures together for interaction.
Eye-based head gestures are a method for measuring head movements through eye movements.
When our gaze point is fixed... Figure... This method allows for measuring a wide range of head movements, even when the movements are very small.
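The core idea can be sketched in a few lines: while the gaze point in the scene stays fixed, the eye counter-rotates in the head, so pupil movement in the head-mounted eye camera mirrors the head movement. This is a minimal illustrative sketch, not the paper's implementation; the function name, the coordinate convention, and the fixation radius are all assumptions.

```python
# Hypothetical sketch: estimate head movement from eye movement during a
# fixation. While the scene gaze point stays within `fixation_radius` of
# its starting position, pupil displacement in the eye camera is taken to
# be driven by head movement, with opposite sign (the eye counter-rotates).
def head_motion_during_fixation(gaze_points, pupil_positions, fixation_radius=10.0):
    """Return per-frame head displacement estimates (dx, dy)."""
    motions = []
    gx0, gy0 = gaze_points[0]          # fixation anchor in scene coordinates
    px_prev, py_prev = pupil_positions[0]
    for (gx, gy), (px, py) in zip(gaze_points[1:], pupil_positions[1:]):
        if (gx - gx0) ** 2 + (gy - gy0) ** 2 <= fixation_radius ** 2:
            # Gaze is still fixed: attribute pupil displacement to the head.
            motions.append((-(px - px_prev), -(py - py_prev)))
        else:
            # Gaze moved: the displacement cannot be attributed to the head.
            motions.append((0.0, 0.0))
        px_prev, py_prev = px, py
    return motions
```

With a fixed gaze point and the pupil drifting steadily to the right, the sketch reports a steady head movement to the left, which is the counter-rotation the method exploits.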
So the idea is ..., figure. So a simple head gesture may have different interpretations depending on which object you are looking at.
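This object-dependent interpretation amounts to a lookup on the pair (attended object, gesture). The sketch below illustrates that idea; the object names, gesture labels, and action strings are invented for illustration, not taken from the system described in the paper.

```python
# Illustrative mapping: the same head gesture triggers different actions
# depending on which object the driver is looking at. All keys and action
# names here are hypothetical examples.
ACTIONS = {
    ("rear_view_mirror", "nod"):   "flip_mirror_down",
    ("rear_view_mirror", "shake"): "flip_mirror_up",
    ("phone_icon", "nod"):         "accept_call",
    ("phone_icon", "shake"):       "reject_call",
}

def interpret(gazed_object, gesture):
    # Returns None when the (object, gesture) pair has no assigned meaning.
    return ACTIONS.get((gazed_object, gesture))
```

The same "nod" gesture thus accepts a call when the driver looks at the phone icon, but flips the mirror when the driver looks at the mirror.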
But can we use this method for interaction inside the car? We minimize the time that the gaze is away from the roadway by transferring the fixed-gaze target from a point on the object to a specified point on the windscreen. Note that:

- For short gestures, no target is needed.
- The gestures are meaningful.
- Gazing at the target can be easily detected.
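Detecting that the gaze rests on the windscreen target is a simple distance check. This is a hedged sketch only; the target coordinates and radius below are invented values, not parameters from the actual system.

```python
# Sketch of the dwell check: a gesture is only accepted while the gaze
# rests on a dedicated target point on the windscreen. The coordinates
# and radius are hypothetical placeholder values.
TARGET = (640.0, 200.0)   # assumed target point, in scene-camera pixels
RADIUS = 40.0             # assumed acceptance radius, in pixels

def gaze_on_target(gx, gy, target=TARGET, radius=RADIUS):
    tx, ty = target
    return (gx - tx) ** 2 + (gy - ty) ** 2 <= radius ** 2
```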
Examples include flipping the rear-view mirror down or up, accepting incoming phone calls, or, in general, any kind of yes/no decision when the system initiates a question or option dialog.
As an intuitive shortcut function, it enables the driver to control the music player and to skip between individual CD tracks or radio stations.
This method has a high potential for interaction with a HUD, enabling the driver to make selections and to switch between different submenus.
Continuous vertical movements of the head can be useful for changing the volume, adjusting the air-conditioning temperature, opening and closing a window, and continuously zooming or scrolling a map. In these examples, visual or auditory feedback through the HUD or the speakers can help the driver perform the task more efficiently.
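For continuous controls like volume, the vertical head-motion estimates can be integrated into the control value while the driver holds the gesture. This is a sketch under assumptions: the gain, the 0-100 range, and the function name are illustrative, not from the paper.

```python
# Hypothetical continuous control: accumulate vertical head motion into a
# volume level, clamped to an assumed 0-100 range. `gain` scales pixels of
# estimated head movement to volume units and is an invented parameter.
def update_volume(volume, head_dy_stream, gain=0.5):
    for dy in head_dy_stream:
        volume = min(100.0, max(0.0, volume + gain * dy))
    return volume
```

The same accumulate-and-clamp pattern would apply to temperature, window position, or map zoom; only the range and gain change.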
The visual feedback can be a highlight color or even a displayed image of the object. For example, when the driver wants to adjust the side-view mirrors, he or she looks at the mirror; the eye tracker recognizes the mirror as the attended object, and the system shows a real-time image of the mirror in the head-up display. Now the driver can see the mirror image in front of the windscreen and can therefore easily adjust the mirror through head movements.