School of Science and Technology
MSc/MA Creative Technology
MDA4605
Final Project Report
«iWizard: Using Eye Tracking Technology to Control the Physical Environment»
Tutor(s): Magnus Moar
Student Name: Maryna Razakhatskaya
Student ID Number: M00548643
Date: 10 October 2016
Project Video: 	https://vimeo.com/186337188
Contents
Introduction
Technical Review
    Overview of Hands-Free Technology
    Eye Tracking Technology
    Eye Tracking: Existing Use Cases by Domain
        Marketing Research
        Assistive Technology
        Art Installations
    Key Takeaways
Method / Project Progress
    iWizard Project Description
    Design and Implementation Decisions
        Art Concept
        Product Design
        Interaction Design
        Hardware
        Software
        Communication
    Problems / Solutions
Results
Reflective Summary
References
Introduction	
iWizard is a creative technology project that explores the application of eye-tracking technology to control physical interfaces remotely. iWizard is a hybrid hardware-software product that consists of:
• the EyeTribe eye-tracking device,
• two hand-crafted physical interfaces operated by Arduino microcontrollers,
• custom software developed in Processing 3.0 that receives data from the eye tracker, interprets it, and transmits it to the microcontrollers.
iWizard is the first known experiment in using eye tracking to actively control the physical environment. Previous applications of active eye-tracking technology required direct user interaction with a computer or digital environment and have mostly served as assistive technology for people with disabilities. Emerging technology now demonstrates another use case for active eye tracking: virtual reality. Cases where a digital screen or VR headset is not required involve passive interactions, in which users provide visual input but do not receive any feedback. Passive eye tracking has proven itself an effective tool for studying visual behaviour and is widely applied in marketing, educational, and usability research.
Eye-tracking methodology has existed since the 1800s and was initially based on direct observation of eye movement during reading. With the rise of computer technology in the 1980s, eye tracking found a new niche in human-computer interaction studies. Eye-tracking devices remained costly, lab-only experimental technology until 2014, when two low-cost eye-tracking sensors were introduced to the market by the companies Tobii and EyeTribe.
Since 2014, eye tracking has been an affordable technology with high potential to explore. [3]
The current interest in eye-tracking technology is explained by the next shift in emerging technology, in which digital transformation gives way to "phygital" transformation: the merging of physical and digital environments. Early attempts to connect the physical to the digital used QR codes, NFC, iBeacons, and the like. New major technological trends in this domain, such as the Internet of Things and mixed reality, have set new requirements for the methods and tools of human-computer interaction. In this context, eye trackers fall into the same category as Leap Motion, Microsoft Kinect, and Siri: data input alternatives to the keyboard, mouse, and touchscreen. In certain environments, such as virtual reality and some physical spaces, buttons or touchscreens may not be usable and will need to be smoothly replaced with contactless systems controlled by motion, voice, eye gaze, or any other input that sensors can capture.
Cultural context also attests to people's desire to use their eyes to control things. Most languages contain idioms and metaphors in which vision becomes an active tool:
"wither with a look" | "if looks could kill"
"his eyes bored holes in me"
Moving objects with the eyes has always been considered a superpower or an extrasensory magical ability, repeatedly mentioned in fairy tales, legends, and science fiction. All of this reflects a latent human desire to empower the eyes to perform actions.
Taking into account this technological and cultural context, as well as the technical feasibility of today's eye tracking, the iWizard project explores and designs new human-computer interaction models.
The report begins with a technical review of existing eye-tracker-enabled projects in different domains. It analyses the strengths and weaknesses of existing projects and defines the opportunity scope for the iWizard project. The second part of the report describes the final product and its design workflow; experiments and practical findings are documented there, and the description explains how design decisions were made at both the hardware and software levels. The Results section assesses the project's success. A reflective statement concludes on the learning experience and summarizes the iWizard project's potential.
Technical	Review	
Overview	of	Hands-Free	Technology	
The recent advancement and affordability of sensors has stimulated strong interest in developing hands-free interfaces. Hands-free interfaces use alternative methods to control digital and physical environments. These methods rely on types of human body activity that do not require touch:
• Voice	(Siri,	OK	Google,	assistive	software	in	Apple	computers),		
• Motion	/	proximity	(Leap	Motion,	Microsoft	Kinect,	iBeacons,	GPS	tracking),	
• Brainwaves	(EEG	headsets	-	Emotiv,	NeuroSky,	BrainLink,	etc.)		
• Eyes	(eye	trackers	–	EyeTribe,	Tobii,	SteelSeries	Sentry,	Fovio,	etc.).	
These alternative input systems can be used to control both digital/computer interfaces and physical objects. Below is a review of recent innovative projects that use voice, motion, brainwaves, and eyes in digital and physical environments.
Table 1. – Comparative Review of Four Examples of Digital and Physical Projects Controlled by Voice, Eyes, Motion, or Brainwaves.

Example 1. DIY Hands-Free Computer Interface
Product: Digital | Input type: Eyes, motion | Usage: Practical / assistive technology
Description: A computer interface that lets users operate a computer using eyes and muscles instead of a mouse and keyboard.
Sensors / hardware: EyeTribe eye tracker, Arduino Uno, Advancer Technologies EMG kit.
Code: EyeTribe SDK, Arduino, Python, Pyserial, Pyautogui.

Example 2. Metamorphy by Scenocosme
Product: Physical | Input type: Motion | Usage: Art installation
Description: Metamorphy is a visual and sonorous interactive artwork that invites visitors to touch and explore the depth of a semi-transparent veil.
Sensors / hardware: Kinect, projector, mirror, audio system.
Code: C++, Max/MSP + Jitter.

Example 3. Board of Imagination
Product: Physical | Input type: Brainwaves | Usage: Practical
Description: Board of Imagination is a brain-operated motorized skateboard whose speed and itinerary are controlled by thought.
Sensors / hardware: Emotiv EPOC headset, Samsung tablet, electric motor.
Code: not disclosed.

Example 4. MOC
Product: Digital | Input type: Voice | Usage: Art installation
Description: MOC explores the relationship between sound and image: as a visitor whistles into a microphone, a tree grows.
Sensors / hardware: Microphone, computer, projector.
Code: not disclosed.

Sources: [1, 6, 7, 14].
The examples in Table 1 demonstrate a wide range of areas for the application of new input methods. With various combinations of hardware and software technologies, we are capable of producing hybrid digital and physical products that have both artistic value and practical application. While voice- and motion-controlled apps and objects are well established and widely promoted by products like Leap Motion and Kinect, the use of eye tracking and brainwaves remains comparatively unexplored. This is explained by a lag in hardware production: until 2014, eye trackers and brain sensors were expensive, bulky, lab-only equipment. Since 2014, compact, affordable consumer devices have been introduced in both fields; however, most of these devices remain experimental prototypes with limited availability to the general public.
Eye	Tracking	Technology	
Since 2014, eye tracking has been an affordable technology with wide potential. There are two types of commercial devices available on the market – sensors and glasses – both non-invasive optical eye trackers.
Sensors are the most affordable; they use infrared light emitters and a camera and come as development kits with software and/or SDKs. Unlike glasses, sensors have certain limits on distance and positioning, and in most cases require calibration for each user, which complicates the interaction flow.
An eye tracker is a sensor that captures eye activity and returns data of the following types:
• gaze points – the point where the user is looking;
• fixations – points where the user's attention settles;
• saccades – vectors of large, rapid eye movements.
These standard data types are used as inputs in active eye tracking (i.e. to control interfaces) or as outputs in passive applications (i.e. visual behaviour research).
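For illustration, a fixation can be approximated from raw gaze points with a simple dwell-time check. The following minimal Processing sketch uses the mouse as a stand-in for the gaze point so it runs without a tracker; the 30-pixel radius and 300 ms threshold are illustrative values, not figures from the project:

    // Approximating a fixation from raw gaze points with a dwell-time check.
    // The mouse stands in for the gaze point; thresholds are illustrative.
    float lastX, lastY;
    int dwellStart;

    void setup() {
      size(800, 600);
      dwellStart = millis();
    }

    void draw() {
      background(255);
      if (dist(mouseX, mouseY, lastX, lastY) > 30) {
        // The gaze jumped away: treat it as a saccade and restart the dwell timer
        lastX = mouseX;
        lastY = mouseY;
        dwellStart = millis();
      } else if (millis() - dwellStart > 300) {
        fill(0);
        text("fixation detected", 10, 20);  // attention has settled on one point
      }
    }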
Image	1.	–	How	the	Eye	Tracker	Works	
Image	source:	[20]	
There are two affordable commercial beta eye trackers available on the market: EyeTribe and Tobii. The Eye Tribe is a Danish company started by PhD researchers in 2011; Tobii is a Swedish company that has grown from a hardware startup in 2001 into a global leader in eye-tracking technology by 2016. Table 2 provides a detailed comparison of the Tobii and EyeTribe devices based on their technical specifications.
Table 2. – Comparison of EyeTribe and Tobii Sensors Based on Technical Specifications:
Latency: EyeTribe Tracker Pro < 16 ms | Tobii EyeX Controller 15 +/- 5 ms
Operating range: EyeTribe 45-75 cm | Tobii 45-100 cm
Tracking area: EyeTribe 50 cm x 30 cm at 65 cm distance | Tobii 45-100 cm
Screen sizes: EyeTribe up to 27" | Tobii up to 27"
API/SDK: EyeTribe Java, C++, C#, Unity, Processing | Tobii UE4, C, C++, .NET, Unity
Data output: EyeTribe binocular gaze data (x/y screen coordinates), 3D eye position, pupil diameter in mm | Tobii gaze point, eye position, fixations
Dimensions (W/H/D): EyeTribe 22 x 15 x 220 mm | Tobii 20 x 15 x 230 mm
Weight: EyeTribe < 100 g | Tobii 0.2 lb / 91 g
Interface: EyeTribe USB 3.0 | Tobii USB 3.0
Operating system: EyeTribe Windows 10 / 8 / 7, MacOS, Android | Tobii Windows 10 / 8.1 / 7
Sources: [19, 23].
Table 2 shows that the EyeTribe and Tobii EyeX devices are broadly similar across most technical specifications. For developers building applications for Mac and Android devices, as well as for users of those devices, EyeTribe is the preferable – indeed the only – option.
A search of custom third-party SDKs and libraries revealed that the EyeTribe device can also be used with the Processing environment, thanks to the «Eye Tribe for Processing» library developed and published by Jorge C. S. Cardoso. The library currently provides functions to get the gaze point and the eye coordinates, and to calibrate the device from within a Processing sketch. [5]
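As an illustration of how such a library might be used, a minimal sketch could look like the following. The import path and the EyeTribe class with its gazeX()/gazeY() accessors are assumptions based on the capabilities the library documents (gaze point, eye coordinates, calibration), not verified signatures; the library's own examples give the exact API:

    // Hypothetical usage of the Eye Tribe for Processing library.
    // Class and method names below are assumptions, not verified signatures.
    import eyetribe.*;               // assumed package name

    EyeTribe tracker;                // assumed wrapper around the Eye Tribe server

    void setup() {
      size(800, 600);
      tracker = new EyeTribe(this);  // connects to the locally running server
    }

    void draw() {
      background(0);
      float gx = tracker.gazeX();    // assumed accessor for the gaze point
      float gy = tracker.gazeY();
      ellipse(gx, gy, 20, 20);       // mark where the user is looking
    }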
The dimensions of both devices are quite small and make them handy to use. Operating range and screen size constraints, however, limit the use of eye trackers with bigger screens, with physical interfaces, and at longer distances.
When building products with remote eye trackers, it is critical to consider the calibration that remote eye trackers require in order to work accurately. The process usually means following dots on a screen, takes up to two minutes, and tends to become a barrier to designing smooth eye-tracking interactions.
Student	Name:	Maryna	Razakhatskaya	 Student	Number:	M00548643
	
7	
Eye	Tracking:	Existing	Use	Cases	by	Domain	
The application of eye-tracking technology is defined by the nature and behaviour of human eyes: eyes move at very high speed, eyes can focus, and eyes can be open, closed, or blinking. [24] With regard to practical application, there are three main reasons why eye tracking becomes an appropriate solution:
• to collect visual behaviour data for marketing, educational, and UX research,
• to control digital and physical interfaces when hands are not available,
• to add an unexpected, interesting element to interactive art installations.
These three reasons let all existing eye-tracker-enabled projects be classified into three categories: research, assistive technology, and art installations. Below is a review of projects in each category.
Marketing Research
The	Focus	Project	
The Focus Project is a research project that installs a non-intrusive eye-tracking device (the Tobii EyeX Controller) to record what users look at while they browse the internet on a daily basis. The objective of this research is to understand how people view and interact with online media. [12]
EyeProof	
EyeProof is a cloud-based eye-tracking analytics platform for digital products. EyeProof requires the EyeTribe sensor and allows ads and websites to be tested on a computer or a tablet. Eye-tracking results are analyzed online in the EyeProof platform, with heatmaps, gaze paths, and statistics. [9]
Assistive Technology
Assistive technology can be divided into three groups by purpose of use:
• First, eye tracking is used by people with disabilities.
o Eye Conductor is a musical interface that allows people with physical disabilities to play music through eye movements and facial gestures. Using the EyeTribe eye tracker and a regular webcam, Eye Conductor detects the gaze and selected facial movements, thereby enabling people to play any instrument, build beats, sequence melodies, or trigger musical effects. The system is open, designed for inclusion, and can be customised to fit the physical abilities of whoever is using it. Eye Conductor translates eye gaze into musical notes or beats in a drum sequencer. Raising the eyebrows can transpose all played notes up a full octave, while opening the mouth can add a delay, reverb, or filter effect to the instrument being played. Thresholds for facial gestures can be adjusted and saved to fit the unique abilities of different users. Eye Conductor is programmed in Processing. [2]
o EyeWriter is a low-cost, open-source eye-tracking project that allows ALS patients to draw with their eyes. It was inspired by and built for the LA graffiti writer TEMPT ONE. EyeWriter consists of eye-tracking software designed for DIY low-cost glasses [18] and/or the commercial Tobii eye tracker, plus drawing software for drawing with eye movements. The project has been developed in openFrameworks, a cross-platform C++ library for creative development. [10] It tracks the position of a pupil and uses a calibration sequence to map the tracked eye/pupil coordinates to positions on a computer screen or projection. [10]
• Second, eye tracking is used in computer games with intense scenarios, where users need to perform multiple actions simultaneously at high speed and hands alone are not enough. Computer games are an additional niche for eye-tracking technology.
o Tobii Apps is for playing PC games with an eye-tracking controller. Tobii Apps features 40 computer games that use eye tracking to navigate game environments in a more intuitive way. [21]
o SteelSeries Engine lets gamers play using their eyes in addition to the main hand-controlled systems. It uses the Sentry Eye Tracker, a custom eye-tracking device with the same specifications as the Tobii or EyeTribe sensors. [17]
With the development of VR games and of eye tracking in VR headsets, it is reasonable to expect wide application of eye tracking in VR games.
• Third, eye tracking is used in virtual reality (VR) headsets to navigate VR scenes and provide a hands-free interface. This is the most promising area for the mass development of eye tracking, and the one toward which leaders in the field are heading. [13]
Art Installations
Eye Tracked Paintings
Eye Tracked Paintings (Dreamstage, 2015) is a set of interactive digital images designed by Dreamstage [8] to respond to eye movements captured with the Tobii EyeX Controller. The images are downloaded to a computer, and the user interacts with the digital screen. [22]
Eyecode	
Eyecode	(Golan	Levin,	2007)	is	an	interactive	installation	whose	display	is	wholly	constructed	
from	its	own	history	of	being	viewed.	By	means	of	a	hidden	camera,	the	system	records	and	
replays	brief	video	clips	of	its	viewers'	eyes.	Each	clip	is	articulated	by	the	duration	between	
two	of	the	viewer's	blinks.	The	unnerving	result	is	a	typographic	tapestry	of	recursive	
observation.		
Eyecode is implemented in openFrameworks [15] and uses the OpenCV computer vision library. [11]
The review of existing eye-tracking projects leads to the conclusion that eye-tracking technology has traditionally been used in research, art installations, and assistive technology, with a rising trend toward games and VR. Most projects use Tobii, EyeTribe, or custom-developed eye trackers. The software for most projects is built in C++ / openFrameworks or Unity. It must be emphasized that all of the eye-tracker-enabled projects found are digital, with no examples or evidence of eye tracking being used as a control tool for physical interfaces.
Key	Takeaways	
iWizard pre-project research consisted of three major steps: an overview of hands-free technology, an assessment of eye-tracking technology in particular, and finally a review, classification, and evaluation of existing projects that use eye tracking.
Findings of the research:
• Connecting the physical and digital worlds is a major trend that stimulates the development of hands-free interfaces controlled by voice, motion, eyes, and brainwaves;
• Hands-free interfaces are built for digital, physical, and hybrid products;
• Eye tracking and brainwaves are the less explored areas;
• All of the discovered eye-tracking projects are digital only;
• Eye tracking is used in three domains – research (passive), interactive art and assistive technology (active) – with the potential for wide usage in VR headsets and games;
• There are two affordable eye trackers – EyeTribe and Tobii – with similar specifications and SDKs for C++, C, Java, .NET, and Unity;
• The EyeTribe tracker can be used with a Macintosh computer and supports the Processing language via a custom-built library;
• Eye tracking has three major weaknesses: the need for calibration; a screen size limit of 27"; and a required proximity of less than 1 m between the device, user, and computer screen.
These research findings define the goals of the iWizard project:
1. Build the first use case and a proof of concept that eye tracking – like other types of hands-free interaction – can be applied in physical products, i.e. smoothly embedded to give the eyes control of the physical environment.
2. Challenge the limits of existing eye trackers:
a. the screen size limit of 27",
b. the requirement that screen, eye tracker, and user be less than 1 metre apart in total,
c. the need for calibration.
3. Find an optimal and elegant technical solution to connect the eye tracker, custom software, and a microcontroller-enabled physical interface into a single hybrid product.
4. Test and explore the psychology of human eye behaviour and implement it in the product.
5. Turn the project into a form of art installation that can be submitted to digital and interactive art contests.
It is important to note that control of the physical environment assumes that this environment is smart. To better understand this environment, a second round of research was done to define the common approaches to controlling physical objects. While in IoT most interactions happen automatically (for example, a device senses the proximity of a user and automatically activates a certain feature), the rest of the interactions are controlled by users manually, using their hands (pressing buttons, touching smartphone screens, etc.). There are very few examples of hands-free interfaces in IoT and/or mixed reality, and one project – IoTxMR – should be highlighted as an inspiration for iWizard.
IoTxMR is an app that lets a user interact with a smart home via augmented reality, using eyes and gestures. The Microsoft HoloLens app connects various Android and Arduino-based devices and creates a layer of augmented reality where a digital interface is placed. This digital interface controls physical devices and is operated by eyes and gestures. [4]
Taking the IoTxMR project as an example, it was decided to minimize and shorten the interaction flow from User → AR layer → Physical World to simply User → Physical World.
The progress of the iWizard project and the decisions about design and technology are based on the findings of this technical review and the goals defined as a result of this research.
Method	/	Project	Progress	
iWizard	Project	Description		
iWizard is an interactive art installation that consists of two physical pen-and-ink images and an eye tracker. Interactivity is provided by the eye tracker, which captures users' eye movements; the images are set in motion by the spectator's gaze.
Image	2.	–	iWizard	Interactive	Art	Installation	
		 	
	
• The 1st image is a house with large windows. It explores the human psychology of looking into the windows of houses – curiosity and the fear of being spotted. There are six windows, five of which respond to being looked into with motion, sound, and light. It contains 3D-printed objects and a real plant. Interaction is infinite. Image dimensions are 84 x 60 cm, comparable to a 40" vertical screen.
	
• The 2nd image is a set of two copies of the same image of a kitchen, but with 6 differences. It points to and plays on the natural human tendency to compare things and spot differences. Each spotted difference is highlighted with light, sound, or motion. When all differences are spotted, the user receives audio congratulations and the interaction finishes. Image dimensions are 60 x 84 cm, comparable to a 40" horizontal screen.
• The eye tracker is set up on a table 2-3 metres away from the images. The user sits in a chair in front of the eye tracker, with eyes about 20 cm above the tracker and body 30-40 cm away from it.
iWizard interaction is programmed entirely in Processing 3.0 with the use of the Eye Tribe UI and server. It is equipped with the Eye Tribe tracker, an electronics system of lights and motors controlled by Arduino microcontrollers, and a hidden MacBook Air computer.
The calibration phase is eliminated from the interaction flow. Calibration is performed in advance using a projector and has proved to work with high accuracy for different users, given the fixed positions of the images, eye tracker, and user in physical space.
	
Design and Implementation Decisions
Design and implementation decisions were based on the project goals and research findings and were adjusted during the development and making process.
The project timeline is defined by the following stages of work, each followed by making and testing.
Art Concept
The art concept was the most challenging part of the process. It started with listing language idioms and cases from folklore that hint at people's craving for a superpower to do things with their eyes (see Image 3). Eventually, the concept deepened into an exploration of curiosity, the fear of our eye activity being spotted, and the intuitive critical thinking reflected in the human tendency to compare things.
Image	4.	-	Sketches	
		 	 	
Product Design
Product design was based on access to materials, the need for the images to act as a grid / coordinate system and have a rectangular shape, and the project goals of challenging eye-tracker specifications and creating direct interaction between user and images. Accordingly, the images are simple pen-and-ink black-and-white cardboards with 3D-printed (cat) and natural (plant) elements. The computer is hidden and does not interact with the user.
Image	5.	–	Product	Design	Process	Photographs	
		 		 		 	
Interaction Design
Interactions	were	designed	as	use	cases	supporting	general	art	concept.	Interaction	flow	was	built	
using	XMind	software	(see	Images	6-7).	
Image	6.	–	«Differences»	Interaction	Flow	Diagram
Image	7.	-	«House»	Interaction	Flow	Diagram	
	
Hardware
The following hardware is used in iWizard:
• 2 Arduino Uno microcontrollers – one for each image. (The initial idea was to use the wireless Particle Photon Wi-Fi board, but it proved unworkable on high-security wireless networks; in addition, communication over the internet via HTTP was too slow for transmitting eye-movement data.)
• The EyeTribe eye tracker. (The choice of device was defined by its availability, its compatibility with MacOS, and the existence of the Eye Tribe library for Processing.)
Software
iWizard software is programmed entirely in Processing 3.0 with a set of libraries:
• the Eye Tribe Library for Processing by Jorge C. S. Cardoso – parses data from the eye tracker;
• the Serial, Arduino, and Firmata libraries – ensure communication with the Arduino;
• the Sound library – produces audio effects.
The program transforms the target «screen» into a grid, loops to check the current eye-gaze position, and performs actions conditional on where the spectator is gazing, as sketched below.
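A minimal sketch of that grid lookup follows; the 6 x 4 grid and the printed "action" are illustrative, and the mouse stands in for the gaze point so the sketch runs without the tracker:

    // Mapping a gaze point to a cell of the target "screen" grid.
    // The mouse stands in for the gaze point; the grid size is illustrative.
    int cols = 6, rows = 4;

    void setup() {
      size(840, 600);  // proportions of the 84 x 60 cm image
    }

    void draw() {
      background(255);
      int col = constrain(int(mouseX / (width / float(cols))), 0, cols - 1);   // gazed column
      int row = constrain(int(mouseY / (height / float(rows))), 0, rows - 1);  // gazed row
      // An action would fire conditional on the gazed cell; here we just report it
      fill(0);
      text("gazing at cell (" + col + ", " + row + ")", 10, 20);
    }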
Processing was chosen as an optimal and elegant solution for the iWizard installation because:
• the iWizard interaction flow is a loop, and Processing is based on loops;
• it communicates easily with Arduino;
• a custom EyeTribe library for Processing has been made public;
• it eliminates the need to create unnecessary 3D scenes with a Unity plugin;
• it eliminates the need for a deep programming background (i.e. C++, Java, .NET) to build eye-tracking interaction.
Communication
Communication between the eye tracker, microcontrollers, custom software, and eye-tracker server runs over serial ports. The EyeTribe device sends data to the computer via a USB 3.0 lead; the Arduino receives data via a USB lead connected to a serial port.
Serial communication was chosen over wireless/Wi-Fi communication based on the speed of data transmission – eyes move very fast, and delay in response negatively affects the interaction experience.
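A minimal sketch of the Processing side of this serial link, using the standard processing.serial library, is shown below. It assumes the Arduino is the first port in the system's port list and that the Arduino sketch listens for single-byte commands; both the port index and the byte protocol are illustrative assumptions, not the project's exact protocol:

    import processing.serial.*;

    Serial arduino;

    void setup() {
      // Assumes the Arduino is the first listed port; adjust the index as needed
      arduino = new Serial(this, Serial.list()[0], 9600);
    }

    void draw() {
      // Illustrative protocol: one byte identifying the gazed target area (0-5).
      // In iWizard this value would come from the gaze-to-grid lookup.
      int targetId = 2;
      arduino.write(targetId);  // single-byte commands keep the link fast
    }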
Problems	/	Solutions	
The scientific value of the iWizard project lies in the discovery of solutions (see Table 3) that overcome the limits of eye-tracking device specifications.
Table 3. – Problems and Solutions in Project Progress.
Problem: Infinite looping of each action once its target area is spotted.
Solution: For each conditional case, a boolean flag variable was added to check the status of the action and change it according to the interaction logic (see the sketch after this table).
Problem: The 27" screen-size limit versus an image size of 40", and the requirement that screen, eye tracker, and user be less than 1 metre apart in total.
Solution: Hypothesis 1. Based on the rules of optics, a screen placed further from the viewer can be of a larger size (see Image 8).
Image 8. – Optical scheme.
Testing 1. The hypothesis was tested by means of a projection located 3 metres from the user (see Image 9). Testing 1 proved the assumption to be true.
Image 9. – Testing 1 of Hypothesis 1.
Testing 2. The hypothesis was tested with Arduino-enabled physical images located 3 metres from the user (see Image 10). Testing 2 proved the assumption to be true again.
Image 10. – Testing 2 of Hypothesis 1.
Finding: Eye tracking can be used with larger screens and surfaces, and at a longer distance between the user and the screen.
Problem: The need to calibrate for each user, each time.
Solution 1: Build a physical device for calibration to replace the computer. Visual sketching of the device based on the requirements specification proved this solution impractical.
Image 11. – Sketch for Solution 1.
	
Solution 2: Use the concept of a manual gunsight to replace computer calibration. The concept worked, but with quite low accuracy, and required setting yet another limit – the distance between the user and the eye tracker.
Image 12. – Concept for Solution 2.
Solution 3: Use a projection with the same aspect ratio to pre-calibrate the eye tracker in a fixed setup. This concept proved successful: different users were able to interact with the installation with only a slight accuracy offset of up to 6 cm.
Image 13. – Concept for Solution 3.
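As referenced in Table 3, a minimal sketch of the boolean-flag pattern follows, with one illustrative target area and the mouse standing in for the gaze point:

    // Boolean-flag pattern: fire each target's action once, instead of
    // looping it on every frame while the target remains gazed at.
    boolean windowTriggered = false;  // illustrative flag for one target area

    void setup() {
      size(840, 600);
    }

    void draw() {
      boolean gazeOnWindow = (mouseX < 200 && mouseY < 200);  // gaze stand-in
      if (gazeOnWindow && !windowTriggered) {
        windowTriggered = true;                // mark the action as done
        println("window action fired once");   // lights/sound/motion would go here
      }
    }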
Results	
The iWizard project can be considered successful: it achieved all of its goals, worked smoothly in accordance with the design and interaction concept, and produced new findings about the behaviour of commercial eye trackers.
1. iWizard proved the concept that eye tracking can be used to control the physical environment.
2. It created the feeling of direct communication between human eyes and physical objects.
3. It pushed past the technical limits of eye-tracking devices and showed ways to avoid the calibration routine: iWizard is pre-calibrated only once, uses physical "screens" of 40" size, and is set up with 2-3 m between the user and the images.
4. An optimal and elegant technical solution was found to connect an eye tracker, custom software, and a microcontroller-enabled physical interface into a single hybrid product.
5. iWizard addressed the psychological and cultural context of human eye behaviour and triggered spectator emotions: surprise, fear, curiosity, and a feeling of superpower.
6. iWizard is an interactive digital art installation that can be submitted to digital and interactive art contests such as the Lumen Prize, or exhibited in maker spaces and libraries.
Reflective	Summary	
The iWizard project became a successful final accord to the MSc/MA Creative Technology programme. Throughout this project I advanced my Processing programming skills through the use of different libraries, the setup of serial communication, advanced multi-conditional loops, two-dimensional arrays, and the use of boolean variables to flag the state of events.
In-depth research into creative technology trends, hands-free interfaces, and particularly the specifics of eye tracking and optics helped build expertise in this emerging field and produce a project that is a first-of-its-kind working proof of concept.
The area for improvement lies in further advancements of eye-tracking technology. VR headsets and mixed-reality glasses like the Microsoft HoloLens will facilitate the use of eye tracking and will bring to market large numbers of apps that rely on eyes and gestures to interact with the physical environment. In this context it might be reasonable to build eye-tracking-enabled apps in Unity rather than in Processing or another programming language.
While eye tracking has not yet become a mass trend, it was logical to introduce eye-tracker-enabled interaction in the form of an art installation. The iWizard project will be submitted to digital arts contests, and an offer has already been made to exhibit the project at the DigiLab digital hub in East London.
The iWizard concept can also be used in eye-care centres and the offices of large companies as an eye-fitness machine. If interactions are designed to make people perform specific patterns of eye movement, this would help train eye muscles in a fun, interactive way and sustain the eye health of spectators.
iWizard is a successful working prototype of a hybrid creative technology product that uses eye tracking to interact with the physical world.
References	
1. Ancxt, S. (2016). Metamorphy. [online] Scenocosme.com. Available at:
http://www.scenocosme.com/metamorphy_e.htm [Accessed 9 Oct. 2016].
2. Andreasrefsgaard.dk.	(2016).	Eye	Conductor	|	Andreas	Refsgaard.	[online]	Available	at:	
http://andreasrefsgaard.dk/project/eye-conductor/	[Accessed	9	Oct.	2016].	
3. Biggs,	J.	(2016).	The	Eye	Tribe	Tracker	Pro	Offers	Affordable	Eye	Tracking	For	$199.	[online]	
TechCrunch.	Available	at:	https://techcrunch.com/2016/01/14/the-eye-tribe-tracker-pro-
offers-affordable-eye-tracking-for-199/	[Accessed	9	Oct.	2016].	
4. Blog.arduino.cc.	(2016).	Arduino	Blog	–	Control	with	your	smart	devices	by	staring	and	
gesturing.	[online]	Available	at:	https://blog.arduino.cc/2016/07/26/control-with-your-
smart-devices-by-staring-and-gesturing/	[Accessed	9	Oct.	2016].	
5. Cardoso,	J.	(2016).	Eye	Tribe	for	Processing.	[online]	Jorgecardoso.eu.	Available	at:	
http://jorgecardoso.eu/processing/eyetribeprocessing/	[Accessed	9	Oct.	2016].	
6. Catalyst	Frame.	(2016).	DIY	Hands-Free	Computer	Interface.	[online]	Available	at:	
http://www.catalystframe.com/hands-free-computer-interface/	[Accessed	9	Oct.	2016].	
7. Cha,	B.	(2012).	Brainwave-controlled	skateboard	is	totally	mental.	[online]	CNET.	Available	
at:	https://www.cnet.com/uk/news/brainwave-controlled-skateboard-is-totally-mental/	
[Accessed	9	Oct.	2016].	
8. Dreamstage.se.	(2016).	DreamStage.	[online]	Available	at:	http://dreamstage.se/	[Accessed	
9	Oct.	2016].	
9. Eyeproof.net.	(2016).	EyeProof	|	Analytics.	[online]	Available	at:	http://www.eyeproof.net/	
[Accessed	9	Oct.	2016].	
10. Eyewriter.org.	(2016).	EyeWriter.	[online]	Available	at:	http://www.eyewriter.org/	[Accessed	
9	Oct.	2016].	
11. Flong.com.	(2016).	Eyecode	-	Interactive	Art	by	Golan	Levin	and	Collaborators.	[online]	
Available	at:	http://www.flong.com/projects/eyecode/	[Accessed	9	Oct.	2016].	
12. Focusproject.co.uk.	(2016).	FAQs	-	The	Focus	Project.	[online]	Available	at:	
http://focusproject.co.uk/faqs/	[Accessed	9	Oct.	2016].	
13. INTRODUCING	EYE	TRACKING	IN	VIRTUAL	REALITY.	(2016).	1st	ed.	[ebook]	Copenhagen.	
Available	at:	https://theeyetribe.com/wp-content/uploads/2016/01/vr-product-sheet.pdf	
[Accessed	9	Oct.	2016].	
14. Lab212.org.	(2016).	Lab212.	[online]	Available	at:	http://lab212.org/Moc	[Accessed	9	Oct.	
2016].	
15. Openframeworks.cc.	(2016).	openFrameworks.	[online]	Available	at:	
http://openframeworks.cc/	[Accessed	9	Oct.	2016].	
16. PSFK.	(2013).	How	One	Artist	Paints	Using	Only	Her	Eyes	-	PSFK.	[online]	Available	at:	
http://www.psfk.com/2013/05/painting-eye-tracking-tobii-intel.html	[Accessed	9	Oct.	
2016].	
17. Steelseries.com.	(2016).	Sentry	Eye	Tracker	|	SteelSeries.	[online]	Available	at:	
https://steelseries.com/gaming-controllers/sentry	[Accessed	9	Oct.	2016].	
18. The	EyeWriter	DIY	Guide.	(2009).	1st	ed.	[ebook]	Q-Branch.	Available	at:	
http://fffff.at/eyewriter/The-EyeWriter.pdf	[Accessed	9	Oct.	2016].	
19. Theeyetribe.com.	(2016).	Products	–	The	Eye	Tribe.	[online]	Available	at:	
https://theeyetribe.com/products/	[Accessed	9	Oct.	2016].
Student	Name:	Maryna	Razakhatskaya	 Student	Number:	M00548643
	
19	
20. Tobii.com.	(2016).	This	is	eye	tracking.	[online]	Available	at:	
http://www.tobii.com/group/about/this-is-eye-tracking/	[Accessed	9	Oct.	2016].	
21. Tobii.com.	(2016).	Tobii	Apps	–	eye	tracking	enabled	games	and	apps.	[online]	Available	at:	
http://www.tobii.com/xperience/apps/	[Accessed	9	Oct.	2016].	
22. Tobii.com.	(2016).	Tobii	eye	tracking	painting.	[online]	Available	at:	
http://www.tobii.com/xperience/apps/eye-tracked-paintings/	[Accessed	9	Oct.	2016].	
23. Tobii.com.	(2016).	Tobii	EyeX	Controller	–	get	your	own	eye	tracker.	[online]	Available	at:	
http://www.tobii.com/xperience/products/	[Accessed	9	Oct.	2016].	
24. Yarbus,	A.	(1967).	Eye	Movements	and	Vision.	1st	ed.	[ebook]	New	York:	Plenum	Press.	
Available	at:	
http://wexler.free.fr/library/files/yarbus%20(1967)%20eye%20movements%20and%20visio
n.pdf	[Accessed	9	Oct.	2016].