School of Science and Technology
MSc/MA Creative Technology
MDA4605
Final Project Report
«iWizard: Using Eye Tracking Technology to Control the Physical Environment»
Tutor(s): Magnus Moar
Student Name: Maryna Razakhatskaya
Student ID Number: M00548643
Date: 10 October 2016
Project Video: https://vimeo.com/186337188
Introduction
iWizard is a creative technology project that explores the application of eye-tracking technology to
control physical interfaces remotely. iWizard is a hybrid hardware-software product that consists of:
• the EyeTribe eye-tracking device,
• two versions of physical interfaces, hand-crafted and operated by Arduino microcontrollers,
• custom software developed in Processing 3.0 that receives data from the eye tracker,
interprets it, and transmits it to the microcontrollers.
iWizard is the first known experiment in using eye tracking to actively control the physical environment.
Previous applications of active eye-tracking technology required direct user interaction with a
computer or digital environment and have mostly been used as assistive technology for people with
disabilities. Nowadays, emerging technology demonstrates another use case for active eye tracking -
virtual reality. Cases where a digital screen or VR headset is not required involve passive interactions,
in which users provide visual input but do not receive any feedback. Passive eye tracking has proven
itself an effective tool for studying visual behaviour and is widely applied in marketing, educational,
and usability research.
Eye-tracking methodology has existed since the 1800s and was initially based on direct observation of eye
movement during reading. With the rise of computer technology in the 1980s, eye tracking
found a new niche in human-computer interaction studies. Eye-tracking devices remained costly,
lab-only experimental technology until 2014, when two low-cost eye-tracking sensors were
introduced to the market by Tobii and The Eye Tribe.
Since 2014, eye tracking has been an affordable technology with high potential to explore. [3]
The current interest in eye-tracking technology is explained by the next shift in emerging technology,
in which digital transformation gives way to “phygital” transformation - the merging of physical and
digital environments. The early attempts to connect the physical to the digital used QR codes, NFC,
iBeacons, etc. New major technological trends in this domain, such as the Internet of Things and mixed
reality, have set new requirements for the methods and tools of human-computer interaction. In this
context, eye trackers fall into the same category as Leap Motion, Microsoft Kinect, Siri, etc., being a
data input alternative to the keyboard, mouse, and touchscreen. In certain environments, such as virtual reality
and sometimes physical spaces, the use of buttons or touchscreens might not be possible and will
require seamless replacement with contactless systems controlled by motion, voice, eye gaze, or any
other input that can be captured by sensors.
Cultural context also demonstrates people’s desire to use their eyes to control things. Most languages contain
idioms and metaphors in which vision becomes an active tool:
“wither with a look” | “if looks could kill”
“his eyes bored holes in me”
Moving objects with the eyes has always been considered a superpower or an extrasensory magical
ability, repeatedly mentioned in fairy tales, legends, and science fiction books. All of these represent a
hidden striving of humanity to empower the eyes to perform actions.
Taking into account the technological and cultural context, as well as the technical feasibility of today’s
eye tracking, the iWizard project explores and designs new, relevant human-computer interaction
models.
The report will begin with a technical review of existing eye-tracker-enabled projects in different
domains. It will contain an analysis of the strengths and weaknesses of existing projects and will define the
opportunity scope for the iWizard project. The second part of the report will describe the final product and
its design workflow. Experiments and practical findings are documented in this chapter. The
technology by 2016. Table 2 provides a detailed comparison of Tobii and EyeTribe devices based on
technical specifications.
Table 2. - Comparison of EyeTribe and Tobii Sensors Based on Tech Specifications:

|                    | EyeTribe Tracker Pro                  | Tobii EyeX Controller       |
| Latency            | < 16 ms                               | 15 ± 5 ms                   |
| Operating range    | 45-75 cm                              | 45-100 cm                   |
| Tracking area      | 50 cm x 30 cm at 65 cm distance       | 45-100 cm                   |
| Screen sizes       | Up to 27"                             | Up to 27"                   |
| API/SDK            | Java, C++, C#, Unity, Processing      | UE4, C, C++, .NET, Unity    |
| Data output        | Binocular gaze data (x/y screen coordinates), 3D eye position, pupil diameter in mm | Gaze point, eye position, fixations |
| Dimensions (W/H/D) | 22 x 15 x 220 mm                      | 20 x 15 x 230 mm            |
| Weight             | < 100 g                               | 91 g (0.2 lb)               |
| Interface          | USB 3.0                               | USB 3.0                     |
| Operating system   | Windows 10/8/7, macOS, Android        | Windows 10/8.1/7            |

Sources: [19, 23].
Table 2 displays the relative similarity of the EyeTribe and Tobii EyeX devices across the majority of
technical specifications. For developers building applications for Mac and Android devices, as well as
for users of those devices, the EyeTribe is the preferable - and only - option.
After a search for third-party SDKs and libraries, it turned out that the EyeTribe device can also be
used with the Processing environment thanks to the «Eye Tribe for Processing» library developed and
published by Jorge C. S. Cardoso. The library currently provides functions to get the gaze point and the
eye coordinates, and to calibrate the device within the Processing sketch. [5]
The dimensions of both devices are quite small, which makes them handy to use. Operating range and
screen size limitations, however, restrict the use of eye trackers with bigger
screens, with physical interfaces, and at longer distances.
When building products with remote eye trackers, it is critical to consider the need for calibration,
which remote eye trackers require in order to work accurately. The process usually means following
dots on a screen, takes up to two minutes, and often becomes a barrier to designing smooth
eye-tracking interactions.
Eye Tracking: Existing Use Cases by Domain
The application of eye tracking technology is defined by the nature and behaviour of human eyes: eyes move
at very high speed, can focus, and can be opened, closed, or blinking. [24] With regard to
practical application, there are three main reasons why eye tracking becomes an appropriate
solution:
• to collect visual behaviour data for marketing, educational, and UX research,
• to control digital and physical interfaces when hands are not available,
• to add an unexpected, interesting element to interactive art installations.
These three reasons allow all existing eye-tracker-enabled projects to be classified into three
categories: research, assistive technology, and art installations. Below is a review of projects in each
category.
Marketing Research
The Focus Project
The Focus Project is a research project that involves the installation of a non-intrusive eye
tracking device (Tobii EyeX Controller) to record what users look at while they browse
the internet on a daily basis. The objective of this research is to understand how people view
and interact with online media. [12]
EyeProof
EyeProof is a cloud-based eye-tracking analytics platform for digital products. EyeProof requires the
EyeTribe sensor and allows testing of ads and websites on a computer or a tablet. Results of
eye tracking are analyzed online in the EyeProof platform, with heatmaps, gaze paths, and
statistics. [9]
Assistive Technology
Assistive Technology can be divided into three groups by purpose of use:
• First, eye-tracking is used by people with disabilities.
o Eye Conductor
Eye Conductor is a musical interface that allows people with physical disabilities to
play music through eye movements and facial gestures. Using the EyeTribe eye
tracker and a regular webcam, Eye Conductor detects the gaze and selected facial
movements, thereby enabling people to play any instrument, build beats, sequence
melodies, or trigger musical effects. The system is open, designed for inclusion, and
can be customised to fit the physical abilities of whoever is using it.
Eye Conductor translates eye gaze into musical notes or beats in a drum sequencer.
Raising your eyebrows can be used to transpose all played notes up one full octave,
while opening your mouth can add a delay, reverb, or filter effect to the instrument
being played. Thresholds for facial gestures can be adjusted and saved to fit the
unique abilities of different users.
Eye Conductor is programmed in Processing. [2]
• Eyewriter
o Eyewriter is a low-cost, open source eye-tracking project that allows ALS patients to
draw with their eyes. It was inspired by and built for the LA graffiti writer Tempt One.
EyeWriter consists of eye-tracking software designed for DIY low-cost glasses [18]
and/or a Tobii commercial eye tracker, and drawing software for drawing with
eye movements. The project has been developed in openFrameworks, a cross-
platform C++ library for creative development. [10] It tracks the position of the pupil
and uses a calibration sequence to map the tracked eye/pupil coordinates to
positions on a computer screen or projection. [10]
• Second, eye-tracking is used in computer games with intense scenarios, where users
need to perform multiple actions simultaneously at high speed and hands alone are not
enough. Computer games are an additional niche for eye tracking technology.
o Tobii Apps
Tobii Apps is for playing PC games with an eye-tracking controller. Tobii Apps features
40 computer games that use eye tracking to navigate game environments in a more
intuitive way. [21]
o SteelSeries Engine
SteelSeries Engine offers gamers the option to play using their eyes in addition to the main
hand-controlled systems. It uses the Sentry Eye Tracker, a custom eye-tracking device with the same
specifications as the Tobii or EyeTribe sensors. [17]
With the development of VR games and eye tracking in VR headsets, it is reasonable to expect
wide application of eye tracking in VR games.
• Third, eye-tracking is used in virtual reality (VR) headsets to navigate within VR scenes and
provide a hands-free interface. This is the most promising area for the mass development of eye
tracking, and the one the leaders in the field are heading towards. [13]
Art Installations
EyeTracked Paintings
Eye Tracked Paintings (Dreamstage, 2015) are a set of interactive digital images designed by
Dreamstage [8] to respond to eye movements captured with the Tobii EyeX Controller. The images
are downloaded to a computer, and the user interacts with the digital screen. [22]
Eyecode
Eyecode (Golan Levin, 2007) is an interactive installation whose display is wholly constructed
from its own history of being viewed. By means of a hidden camera, the system records and
replays brief video clips of its viewers' eyes. Each clip is articulated by the duration between
two of the viewer's blinks. The unnerving result is a typographic tapestry of recursive
observation.
Eyecode is implemented in openFrameworks [15] and uses the OpenCV computer vision
library. [11]
The review of existing eye tracking projects leads to the conclusion that eye tracking technology has
traditionally been used in research, art installations, and assistive technology, with a rising trend in
games and VR. Most of the projects use Tobii, EyeTribe, or custom-developed eye trackers. Software
for most of the projects is built in C++/openFrameworks or Unity. It must be emphasized that all of
the eye-tracker-enabled projects are digital, with no examples or evidence of eye tracking being used as a
control tool for physical interfaces.
Key Takeaways
iWizard pre-project research consisted of three major steps: an overview of hands-free technology, an
assessment of eye tracking technology in particular, and finally a review, classification, and evaluation
of existing projects that use eye tracking.
Findings of the research:
• Connecting the physical and digital worlds is a major trend that stimulates the development of
hands-free interfaces controlled by voice, motion, eyes, and brainwaves;
• Hands-free interfaces are built for digital, physical, and hybrid products;
• Eye tracking and brainwave control are less explored areas;
• All of the discovered eye tracking projects are digital only;
• Eye tracking is used in three domains – research (passive), interactive art and assistive
technology (active) – with the potential for wide usage in VR headsets and games;
• There are two affordable eye trackers – EyeTribe and Tobii – with similar specifications and
SDKs for C++, C, Java, .Net, and Unity;
• The EyeTribe tracker can be used with Macintosh computers and supports the Processing language
via a custom-built library;
• There are three major weaknesses of eye tracking: the need for calibration; the screen size limit of
27"; and the required proximity of less than 1 m between the device, the user, and the computer screen.
These research findings define the goals for the iWizard project:
1. Build the first use case and a proof of concept that eye tracking – like other types of hands-
free interaction – can be applied in physical products, i.e. smoothly embedded to give the eyes
control of the physical environment.
2. Challenge the limits of existing eye trackers:
a. the screen size limit of 27",
b. the total distance between the screen, eye tracker, and user of less than 1 metre,
c. the need for calibration.
3. Find an optimal and elegant technical solution to connect the eye tracker, custom
software, and a microcontroller-enabled physical interface into a single hybrid product.
4. Test and explore the psychology of human eye behaviour and implement it in the product.
5. Turn the project into a form of art installation that can be submitted to digital and
interactive art contests.
It is important to note that control of the physical environment assumes that this environment is smart.
To better understand this environment, a second round of research was done to define the
common approaches to controlling physical objects. While in IoT most interactions happen
automatically (for example, a device senses the proximity of a user and automatically activates a certain
feature), the rest of the interactions are controlled by users manually with their hands (pressing
buttons, touching smartphone screens, etc.). There are very few examples of hands-free interfaces
in IoT and/or mixed reality, and there is one project – IoTxMR – worth highlighting as an inspiration for
iWizard.
IoTxMR is an app that lets a user interact with a smart home via augmented reality using eyes and
gestures. The Microsoft HoloLens app connects various Android- and Arduino-based devices and creates a
layer of augmented reality where a digital interface is placed. This digital interface controls the physical
devices and is operated by eyes and gestures. [4]
The program transforms the target «screen» into a grid, loops to check the current eye gaze position,
and performs actions conditional on where the spectator is gazing.
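The grid lookup described above can be sketched in plain Java (the language underlying Processing). The screen size, grid resolution, and class names here are illustrative assumptions rather than iWizard's actual code:

```java
// Maps a raw gaze coordinate to a grid cell index, as the sketch does
// before deciding which action to trigger. Dimensions are illustrative.
public class GazeGrid {
    final int screenW, screenH;  // tracked "screen" size in pixels
    final int cols, rows;        // grid resolution

    GazeGrid(int screenW, int screenH, int cols, int rows) {
        this.screenW = screenW; this.screenH = screenH;
        this.cols = cols; this.rows = rows;
    }

    /** Returns the cell index (row-major) for a gaze point, or -1 if off-screen. */
    int cellAt(float gx, float gy) {
        if (gx < 0 || gy < 0 || gx >= screenW || gy >= screenH) return -1;
        int col = (int) (gx / ((float) screenW / cols));
        int row = (int) (gy / ((float) screenH / rows));
        return row * cols + col;
    }

    public static void main(String[] args) {
        GazeGrid grid = new GazeGrid(1920, 1080, 4, 3); // 4x3 grid
        System.out.println(grid.cellAt(100, 100));   // top-left cell -> 0
        System.out.println(grid.cellAt(1900, 1000)); // bottom-right cell -> 11
        System.out.println(grid.cellAt(-5, 50));     // off-screen -> -1
    }
}
```

In the real sketch, the returned cell index would select which Arduino-driven action to perform on the current pass of the draw loop.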
Processing was chosen as the optimal and most elegant solution for the iWizard installation because:
• the iWizard interaction flow is a loop, and Processing is based on loops;
• it communicates easily with Arduino;
• a custom EyeTribe library for Processing has been made public;
• it eliminates the need to create unnecessary 3D scenes using a Unity plugin;
• it eliminates the need for a deep programming background (i.e. knowledge of C++, Java, or .Net)
to build eye tracking interaction.
Communication
Communication between the eye tracker, microcontrollers, custom software, and the eye tracker server
is ensured via serial ports. The EyeTribe device sends data to the computer via a USB 3.0 lead. The Arduino
receives data via a USB lead connected to a serial port.
Serial communication was chosen over wireless/Wi-Fi communication based on the speed of
data transmission – eyes move very fast, and a delay in response negatively affects the interaction
experience.
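One way to keep such a serial link fast is to send a single byte per gaze event. The encoding below is a minimal sketch in plain Java, an illustrative assumption rather than iWizard's actual protocol; in a Processing sketch the encoded byte would be written with the standard Serial library, and read on the Arduino side with Serial.read().

```java
// Single-byte command protocol between the sketch and the Arduino.
// One byte per message keeps serial latency minimal, which matters
// because gaze position changes very quickly. Values are illustrative.
public class GazeCommand {
    static final byte NONE = 0;          // gaze off-target: do nothing

    /** Encodes a grid cell index (0-based) as a one-byte command, 0 = no target. */
    static byte encode(int cellIndex) {
        if (cellIndex < 0 || cellIndex > 253) return NONE;
        return (byte) (cellIndex + 1);   // shift so 0 stays "no target"
    }

    /** Decodes on the receiving side: returns the cell index, or -1 for NONE. */
    static int decode(byte command) {
        int v = command & 0xFF;
        return v == 0 ? -1 : v - 1;
    }

    public static void main(String[] args) {
        byte msg = encode(5);                   // sketch side: serialPort.write(msg)
        System.out.println(decode(msg));        // prints 5
        System.out.println(decode(encode(-1))); // prints -1
    }
}
```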
Problems / Solutions
The scientific value of the iWizard project is in the discovery of solutions (see Table 3) that overcome the
limits of eye tracking device specifications.
Table 3. – Problems and Solutions in Project Progress.

| Problem | Solution |
| Infinite looping of each action once a target area is spotted | For each conditional case, a flag boolean variable has been added to check the status of the action and change it according to the interaction logic. |
| Screen size limit of 27" vs image size of 40" | Hypothesis 1. Based on the rules of optics, a screen placed further from the viewer can be of a larger size (see Image 8. – Optical scheme). |
| Distance between the screen, eye tracker, and user less than 1 metre in total | |
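The flag-variable fix from the first row of Table 3 can be illustrated in plain Java. Without the flag, an action tied to a gaze target would re-fire on every pass of the draw loop while the spectator keeps looking at it; the flag latches the action so it runs once per visit. Class and method names are illustrative, not the project's actual code:

```java
// Latches a gaze-triggered action so it fires once per entry into the
// target area, instead of once per loop frame. Names are illustrative.
public class GazeTrigger {
    private boolean fired = false;  // has the action already run for this look?
    private int triggerCount = 0;

    /** Called once per draw() frame with whether the gaze is on the target. */
    void update(boolean gazeOnTarget) {
        if (gazeOnTarget && !fired) {
            triggerCount++;   // stand-in for the real action (e.g. a serial write)
            fired = true;     // block repeats until the gaze leaves the target
        } else if (!gazeOnTarget) {
            fired = false;    // re-arm when the spectator looks away
        }
    }

    int count() { return triggerCount; }

    public static void main(String[] args) {
        GazeTrigger t = new GazeTrigger();
        // Gaze dwells on the target for three frames, leaves, then returns.
        boolean[] frames = {true, true, true, false, true};
        for (boolean onTarget : frames) t.update(onTarget);
        System.out.println(t.count()); // prints 2: one trigger per visit
    }
}
```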
Results
The iWizard project can be considered successful because it achieved all six goals of the project,
worked smoothly in accordance with the design and interaction concept, and resulted in scientific
discoveries about the workings of commercial eye trackers.
1. iWizard proved the concept that eye tracking can be used to control the physical environment.
2. It created the feeling of direct communication between human eyes and physical objects.
3. It broke the technical limits of eye tracking devices and showed ways to avoid the calibration
routine: iWizard is pre-calibrated only once, uses physical screens of 40-inch size, and is set up with a
2-3 m distance between users and images.
4. An optimal and elegant technical solution was found to connect an eye tracker, custom
software, and a microcontroller-enabled physical interface into a single hybrid product.
5. iWizard addressed the psychological and cultural context of human eye behaviour and
triggered spectator emotions: surprise, fear, curiosity, and a feeling of superpower.
6. iWizard is an interactive digital art installation that can be submitted to digital and
interactive art contests such as the Lumen Prize, or exhibited in maker spaces and libraries.
Reflective Summary
The iWizard project became a successful final accord of the Master's in Creative Technology programme.
Throughout this project I advanced my Processing programming skills through the use of different
libraries, the setup of serial communication, advanced multi-conditional loops, two-dimensional arrays, and
the use of boolean variables to flag the state of events.
In-depth research into creative technology trends, hands-free interfaces, and particularly the specifics
of eye tracking and optics helped build expertise in this emerging field and build a project that is a
first-of-its-kind working proof of concept.
The area for improvement lies in further advancements of eye tracking technology. VR headsets
and mixed reality glasses like Microsoft HoloLens will facilitate the use of eye tracking and will bring
to the market massive numbers of apps that rely on eyes and gestures to interact with the physical
environment. In this context it might be reasonable to build eye-tracking-enabled apps in Unity
rather than in Processing or any other programming language.
While eye tracking has not yet become a mass trend, it was logical to introduce eye-tracker-enabled
interaction in the form of an art installation. The iWizard project will be submitted to digital arts contests.
There has already been an offer to exhibit the project at the DigiLab digital hub in East London.
The iWizard concept can also be used in eye care centres and the offices of large companies as an eye fitness
machine. If interactions are designed in a way that makes people perform specific patterns of
eye movement, this would help train eye muscles in a fun, interactive way and sustain the eye health
of spectators.
iWizard is a successful working prototype of a hybrid creative technology product that uses eye
tracking to interact with the physical world.
References
1. Ancxt, s. (2016). Metamorphy. [online] Scenocosme.com. Available at:
http://www.scenocosme.com/metamorphy_e.htm [Accessed 9 Oct. 2016].
2. Andreasrefsgaard.dk. (2016). Eye Conductor | Andreas Refsgaard. [online] Available at:
http://andreasrefsgaard.dk/project/eye-conductor/ [Accessed 9 Oct. 2016].
3. Biggs, J. (2016). The Eye Tribe Tracker Pro Offers Affordable Eye Tracking For $199. [online]
TechCrunch. Available at: https://techcrunch.com/2016/01/14/the-eye-tribe-tracker-pro-
offers-affordable-eye-tracking-for-199/ [Accessed 9 Oct. 2016].
4. Blog.arduino.cc. (2016). Arduino Blog – Control with your smart devices by staring and
gesturing. [online] Available at: https://blog.arduino.cc/2016/07/26/control-with-your-
smart-devices-by-staring-and-gesturing/ [Accessed 9 Oct. 2016].
5. Cardoso, J. (2016). Eye Tribe for Processing. [online] Jorgecardoso.eu. Available at:
http://jorgecardoso.eu/processing/eyetribeprocessing/ [Accessed 9 Oct. 2016].
6. Catalyst Frame. (2016). DIY Hands-Free Computer Interface. [online] Available at:
http://www.catalystframe.com/hands-free-computer-interface/ [Accessed 9 Oct. 2016].
7. Cha, B. (2012). Brainwave-controlled skateboard is totally mental. [online] CNET. Available
at: https://www.cnet.com/uk/news/brainwave-controlled-skateboard-is-totally-mental/
[Accessed 9 Oct. 2016].
8. Dreamstage.se. (2016). DreamStage. [online] Available at: http://dreamstage.se/ [Accessed
9 Oct. 2016].
9. Eyeproof.net. (2016). EyeProof | Analytics. [online] Available at: http://www.eyeproof.net/
[Accessed 9 Oct. 2016].
10. Eyewriter.org. (2016). EyeWriter. [online] Available at: http://www.eyewriter.org/ [Accessed
9 Oct. 2016].
11. Flong.com. (2016). Eyecode - Interactive Art by Golan Levin and Collaborators. [online]
Available at: http://www.flong.com/projects/eyecode/ [Accessed 9 Oct. 2016].
12. Focusproject.co.uk. (2016). FAQs - The Focus Project. [online] Available at:
http://focusproject.co.uk/faqs/ [Accessed 9 Oct. 2016].
13. Introducing Eye Tracking in Virtual Reality. (2016). 1st ed. [ebook] Copenhagen.
Available at: https://theeyetribe.com/wp-content/uploads/2016/01/vr-product-sheet.pdf
[Accessed 9 Oct. 2016].
14. Lab212.org. (2016). Lab212. [online] Available at: http://lab212.org/Moc [Accessed 9 Oct.
2016].
15. Openframeworks.cc. (2016). openFrameworks. [online] Available at:
http://openframeworks.cc/ [Accessed 9 Oct. 2016].
16. PSFK. (2013). How One Artist Paints Using Only Her Eyes - PSFK. [online] Available at:
http://www.psfk.com/2013/05/painting-eye-tracking-tobii-intel.html [Accessed 9 Oct.
2016].
17. Steelseries.com. (2016). Sentry Eye Tracker | SteelSeries. [online] Available at:
https://steelseries.com/gaming-controllers/sentry [Accessed 9 Oct. 2016].
18. The EyeWriter DIY Guide. (2009). 1st ed. [ebook] Q-Branch. Available at:
http://fffff.at/eyewriter/The-EyeWriter.pdf [Accessed 9 Oct. 2016].
19. Theeyetribe.com. (2016). Products – The Eye Tribe. [online] Available at:
https://theeyetribe.com/products/ [Accessed 9 Oct. 2016].
20. Tobii.com. (2016). This is eye tracking. [online] Available at:
http://www.tobii.com/group/about/this-is-eye-tracking/ [Accessed 9 Oct. 2016].
21. Tobii.com. (2016). Tobii Apps – eye tracking enabled games and apps. [online] Available at:
http://www.tobii.com/xperience/apps/ [Accessed 9 Oct. 2016].
22. Tobii.com. (2016). Tobii eye tracking painting. [online] Available at:
http://www.tobii.com/xperience/apps/eye-tracked-paintings/ [Accessed 9 Oct. 2016].
23. Tobii.com. (2016). Tobii EyeX Controller – get your own eye tracker. [online] Available at:
http://www.tobii.com/xperience/products/ [Accessed 9 Oct. 2016].
24. Yarbus, A. (1967). Eye Movements and Vision. 1st ed. [ebook] New York: Plenum Press.
Available at:
http://wexler.free.fr/library/files/yarbus%20(1967)%20eye%20movements%20and%20visio
n.pdf [Accessed 9 Oct. 2016].