EyePhone is a system that allows users to control their mobile phone using only their eyes. It tracks the user's eye movement across the phone's display using the front-facing camera and detects eye blinks to select items. EyePhone works in four phases: 1) eye detection using motion analysis to find eye contours, 2) creation of an open eye template, 3) eye tracking via template matching, and 4) blink detection using thresholding. The system was evaluated for accuracy in different lighting conditions and while walking. Potential applications include an eye-based menu and driver safety monitoring. Future work could improve performance under varying conditions and use learning instead of fixed thresholds.
As smartphones evolve, researchers are studying new techniques to ease human-mobile interaction. We propose EyePhone, a novel "hand-free" interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions (e.g., a wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display and mapping this position to an application that is activated by a wink. At no time does the user have to physically touch the phone display.
2. Content
Technology introduction
HPI Evolution
Terminologies
Paper Abstract
EyePhone Design
Evaluation
Application
Demo
Future work
Improvement on Paper
Related work
3. Technology Introduction
Human Computer Interaction (HCI)
"HCI (human-computer interaction) is the study of how people interact with computers and to what extent computers are or are not developed for successful interaction with human beings."
Most HCI technology addresses the interaction between people and computers in "ideal" environments, i.e., where people sit in front of a desktop machine with specialized sensors and cameras centered on them.
4. Technology Introduction
Human Phone Interaction (HPI)
"Human-Computer Interaction (HCI) researchers and phone vendors are continuously searching for new approaches to reduce the effort users exert when accessing applications on limited form factor devices such as mobile phones."
Human-phone interaction (HPI) extends HCI with challenges not typically found in HCI research, more specifically related to the phone and how we use it. To address these goals, HPI technology should be less intrusive; that is:
i) it should not rely on any external devices other than the mobile phone itself;
ii) it should be readily usable with as little user dependency as possible;
iii) it should be fast in the inference phase;
iv) it should be lightweight in terms of computation;
v) it should preserve the phone user experience, e.g., it should not deplete the phone battery over normal operations.
9. HPI Evolution – What Next?
EyePhone – Controlling the Phone with the Eyes
EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone:
i) it tracks the eye and infers its position on the mobile phone display as a user views a particular application;
ii) it detects eye blinks that emulate mouse clicks to activate the target application under view.
10. Terminologies
– Intrusive methods: require direct contact with the eyes.
– Non-intrusive methods: avoid any physical contact with the user. These include:
– Model-based methods
– Appearance-based methods
– Feature-based methods
11. EyePhone Design
1) An eye detection phase
2) An open eye template creation phase
3) An eye tracking phase
4) A blink detection phase
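The four phases above can be sketched as one processing loop per camera frame. This is an illustrative Python outline, not the authors' code: the function names and the split into callbacks are assumptions made here for clarity.

```python
# Illustrative sketch of the EyePhone pipeline: phases 1-2 run once to
# build the open-eye template; phases 3-4 then run on every later frame.
# All callback names are hypothetical, not from the paper.

def eyephone_loop(frames, detect_eye, make_template, track, detect_blink):
    """Process a sequence of camera frames and return, for each tracked
    frame, the inferred eye position and whether a blink was detected."""
    template = None
    events = []
    for frame in frames:
        if template is None:
            contour = detect_eye(frame)                   # 1) eye detection
            if contour is not None:
                template = make_template(frame, contour)  # 2) template creation
            continue
        position, score = track(frame, template)          # 3) eye tracking
        events.append((position, detect_blink(score)))    # 4) blink detection
    return events
```

A caller would plug in the concrete detection, matching, and thresholding routines of the later slides.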
12. Eye detection phase
This phase consists of finding the contour of the eyes by applying a motion analysis technique that operates on consecutive frames. The eye pair is identified by the left and right eye contours.
[Figure: results of the original algorithm running on a desktop with a USB camera. Ref: Eye Tracking and Blink Detection Library (http://tinyurl.com/yb9v)]
[Figure: results of EyePhone running on a Nokia N810; the smaller dots are erroneously interpreted as eye contours.]
13. Eye detection
Based on previous experimental observations, they modify the original algorithm by:
i) reducing the image resolution, which reduces the eye detection error rate;
ii) adding two more criteria to the original heuristics that filter out the false eye contours.
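To illustrate the motion-analysis idea (this is not the authors' implementation), the sketch below differences consecutive grayscale frames and applies size and aspect-ratio heuristics of the kind used to reject false eye contours; the threshold values are invented.

```python
# Sketch of motion-based eye detection: pixels that change between two
# consecutive grayscale frames form a motion mask, and candidate regions
# are kept only if they pass simple plausibility heuristics. Frames are
# lists of lists of 0-255 intensities; all constants are assumptions.

def motion_mask(prev_frame, curr_frame, thresh=25):
    """Binary mask of pixels that changed between two frames."""
    return [
        [1 if abs(c - p) > thresh else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev_frame, curr_frame)
    ]

def plausible_eye_region(box, min_area=4, max_aspect=3.0):
    """Heuristic filter for candidate eye contours, analogous to the extra
    criteria added on the phone: reject bounding boxes (x, y, w, h) that
    are too small or too elongated to be an eye."""
    x, y, w, h = box
    if w * h < min_area:
        return False
    aspect = max(w, h) / max(1, min(w, h))
    return aspect <= max_aspect
```

Lowering the frame resolution, as the slide notes, shrinks the number of spurious motion pixels such a filter has to reject.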
14. Open eye template creation
EyePhone creates a template of a user's open eye once, the first time a person uses the system, using the eye detection algorithm. The template is saved in the persistent memory of the device and fetched when EyePhone is invoked.
A downside of this off-line template creation approach is that a template created in certain lighting conditions might not be perfectly suitable for other environments.
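The one-time persistence described above can be sketched as a simple save/load pair. The file path and the JSON format are assumptions made here; the paper does not specify how the template is stored.

```python
# Sketch of open-eye template persistence: write the template to device
# storage on first use, reload it on later launches. Path and format are
# hypothetical, not from the paper.
import json
import os

TEMPLATE_PATH = "eyephone_template.json"  # assumed location

def save_template(template, path=TEMPLATE_PATH):
    """Persist the open-eye template (e.g., a 2-D list of intensities)."""
    with open(path, "w") as f:
        json.dump(template, f)

def load_template(path=TEMPLATE_PATH):
    """Return the stored template, or None on first run."""
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return json.load(f)
```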
15. Eye tracking
The eye tracking algorithm is based on template matching. The template matching function calculates a correlation score between the open eye template, created the first time the application is used, and a search window.
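A minimal, pure-Python sketch of this correlation step is shown below. The real system would use an optimized library routine; the flat-list patch representation and the candidate-window layout here are illustrative only.

```python
# Sketch of template matching by normalized correlation: score the
# open-eye template against each candidate patch in the search window
# and pick the best one as the tracked eye position.

def ncc(template, patch):
    """Normalized correlation coefficient between two equal-sized
    grayscale patches, each a flat list of pixel intensities.
    Returns a score in [-1, 1]; 1 means a perfect match."""
    n = len(template)
    mt = sum(template) / n
    mp = sum(patch) / n
    num = sum((t - mt) * (p - mp) for t, p in zip(template, patch))
    dt = sum((t - mt) ** 2 for t in template) ** 0.5
    dp = sum((p - mp) ** 2 for p in patch) ** 0.5
    if dt == 0 or dp == 0:
        return 0.0  # a flat patch carries no correlation information
    return num / (dt * dp)

def best_match(template, candidate_patches):
    """Return the index and score of the best-matching candidate patch."""
    scores = [ncc(template, p) for p in candidate_patches]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]
```

The same normalized score feeds the blink detector on the next slide.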
16. Blink Detection
To detect blinks, they apply a thresholding technique to the normalized correlation coefficient returned by the template matching function.
Problem: the quality of the mobile camera is not the same as a good USB camera, and the phone's camera is generally closer to the person's face than a desktop USB camera would be. Because of this, the camera can pick up iris movements, i.e., movement of the interior of the eye due to eyeball rotation. In particular, when the iris is turned towards the corner of the eye, or upwards or downwards, a blink is inferred even if the eye remains open. This occurs because the majority of the eyeball surface then turns white, which is confused with the color of the skin.
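The thresholding rule can be sketched as follows. The threshold value is invented for illustration; the paper uses a fixed threshold whose exact value is not given here.

```python
# Sketch of blink detection by thresholding: the eye is considered closed
# when the open-eye template's correlation score drops below a fixed
# threshold. The value 0.6 is an assumption, not from the paper.

BLINK_THRESHOLD = 0.6  # assumed value

def is_blink(correlation_score, threshold=BLINK_THRESHOLD):
    """Eye considered closed when the match against the open-eye template
    is poor. As noted above, iris movement toward the eye corner can also
    depress the score and trigger a false blink."""
    return correlation_score < threshold

def count_blinks(scores, threshold=BLINK_THRESHOLD):
    """Count closed->open transitions in a sequence of per-frame scores,
    so that a blink spanning several frames is counted once."""
    blinks, closed = 0, False
    for s in scores:
        if s < threshold and not closed:
            closed = True
        elif s >= threshold and closed:
            blinks += 1
            closed = False
    return blinks
```

A learning-based threshold, mentioned under future work, would replace the fixed constant here.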
17. EVALUATION
DS = eye tracking accuracy measured in daylight exposure while steady;
AS = eye tracking accuracy measured in artificial light exposure while steady;
DM = eye tracking accuracy measured in daylight exposure while walking;
BD = blink detection accuracy in daylight exposure.
18. EVALUATION
Average CPU usage, RAM usage, and computation time for one video frame. The front camera supports up to 15 frames per second. The last column reports the percentage of battery used by EyePhone after a three-hour run of the system.
19. Applications
• EyeMenu
• Car driver safety: EyePhone could also be used to detect driver drowsiness and distraction in cars.
20. FUTURE WORK
• Improving the creation of the open eye template and the filtering algorithm for wrong eye contours
• Improving robustness to variations in lighting conditions and to movement of the phone in a person's hand
• Using a learning approach instead of a fixed thresholding policy