Most smart glasses have a small, limited field of view. The head-mounted display often spans both the central and peripheral vision of the wearer. In this paper, we exploit this characteristic to display information in the user's peripheral vision. We introduce a mobile peripheral vision model that can be used on any smart glasses with a head-mounted display, without any additional hardware requirement. This model taps into the otherwise blocked peripheral vision of the user and simplifies multitasking when using smart glasses. To demonstrate the potential applications of this model, we implement an application for indoor and outdoor navigation. We conduct an experiment with 20 participants, on both a smartphone and smart glasses, to evaluate our model under indoor and outdoor conditions. Users report spending at least 50% less time looking at the screen when exploiting their peripheral vision with smart glasses, and 90% of users agree that using the model for navigation is more practical than standard navigation applications.
Peripheral Vision: A New Killer App for Smart Glasses
1. Peripheral Vision: A New Killer App for Smart Glasses
Isha Chaturvedi
Hong Kong University of Science & Technology
Farshid Hassani Bijarbooneh
Hong Kong University of Science & Technology
Tristan Braud
Hong Kong University of Science & Technology
Pan Hui
University of Helsinki
Hong Kong University of Science & Technology
HKUST-DT System and Media Lab
2. We introduce a Mobile Peripheral Vision (MPV) model, which can be
used on smart glasses with a head-mounted display without any
additional hardware requirement, and which simplifies multitasking
when using the glasses.
We use this model to build a navigation application for smart
glasses for both indoor and outdoor scenarios.
3. Angular Field of View of the Human Eye
[Diagram: vision zones by eccentricity, with the central and paracentral areas at the center, surrounded by the near-peripheral zone (out to roughly 30°), the mid-peripheral zone (out to roughly 60°), and the far peripheral zone beyond that (roughly 90° and outward).]
The angular field of view (AFOV or AOV) measures the angular
extent, out of a full 360-degree circle, that is visible to the human eye.
The foveal system, responsible for foveal vision, lies within the
central and paracentral area.
The area outside the foveal system is responsible for
peripheral vision.
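The zone boundaries in the diagram can be captured in a small helper. This is a sketch using the approximate 30°/60° boundaries shown above; the inner central/paracentral limits are assumptions here, and exact zone limits vary across the literature.

```python
def vision_zone(eccentricity_deg: float) -> str:
    """Classify a gaze eccentricity (degrees away from the central
    visual axis) into the vision zones from the diagram above.
    Boundary values are approximate and vary across sources."""
    e = abs(eccentricity_deg)
    if e <= 2.5:              # assumed limit of central vision
        return "central"
    if e <= 8:                # assumed limit of the paracentral area
        return "paracentral"
    if e <= 30:               # near-peripheral zone (per the diagram)
        return "near-peripheral"
    if e <= 60:               # mid-peripheral zone (per the diagram)
        return "mid-peripheral"
    return "far-peripheral"   # everything beyond ~60°
```

A display element rendered at, say, 45° of eccentricity would then fall in the mid-peripheral zone, outside the foveal system.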
4. Angular Field of View of Google Glass
The AFOV of Google Glass is approximately 30 degrees (AFOV ≈ 30°),
significantly smaller than the AFOV of the human eye.
This limited FOV forces users to direct their central eye gaze
towards the small screen of the glass to extract meaningful
information.
8. What has been done so far?
Enhancing the FOV of smart-glasses
Exploring peripheral vision
Navigation on smart-glasses
9. How do we differ from existing models?
Mobile Peripheral Vision (MPV) Model | Near-Eye-Out-Of-Focus Displays
Software solution, light (10 mm × 10 mm) and simple | Hardware solution, large screen, bulky setup (60 mm × 60 mm) and complex
Mobile users | Static users
Variable luminosity and backgrounds, both indoors and outdoors | Fixed luminosity
Information is displayed within the central and near-peripheral area of the normal human eye | Display is located in the far-peripheral area
11. Contributions
Present an MPV Model using color and motion to display visual cues in
the peripheral vision of the user
Develop a high-fidelity peripheral vision-based navigation application
for both indoor and outdoor environment scenarios
Discuss two specific cases, namely strabismus and color-blindness, for
which our MPV model does not apply
13. Combination of 2 Theories
Color Detection in the Peripheral Area of the eye
Motion Detection through Peripheral Vision
14. 2 Types of Cells in the Normal Human Eye
Cone cells – responsible for color detection
Rod cells – responsible for motion and night detection (concentrated at the periphery)
15. Color Sensitivity Zones for the Normal Human Eye
[Figure: the zones of color sensitivity across the visual field; blue is perceived furthest into the periphery, while red and green are confined closer to the center]
16. Use of 3 Colors to Signal Information to the User
Blue – highest number of cones in the periphery
Red and green – high number of cones in the retina center
Blue – high contrast with the other 2 cone types
17. 3 Fundamental Concepts
Motion detection by the peripheral vision using simple shapes
Presence of blue cones at the periphery
Abundance of red and green cones at the center of the retina
19. Basic Navigation Application using the MPV Model
A blue dot blinks on the left side to indicate a point of interest on the left (a). Once the user turns left, the entire screen turns blue to indicate that the user is heading in the correct direction (b).
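The two-stage cue on this slide can be sketched as a small state function: blink a blue dot on the side of the turn until the user faces the right way, then fill the screen blue as confirmation. The function name, return structure, and the 15-degree heading tolerance are illustrative assumptions, not the paper's implementation.

```python
def navigation_cue(turn_direction: str, heading_error_deg: float) -> dict:
    """Return the display state for the peripheral navigation cue.

    turn_direction:    "left" or "right" (hypothetical parameter).
    heading_error_deg: difference between the user's heading and the
                       target heading, in degrees.
    """
    if abs(heading_error_deg) < 15:  # assumed tolerance, not from the paper
        # User faces the correct direction: fill the whole screen blue.
        return {"screen": "blue", "dot": None}
    # Otherwise, blink a blue dot on the side of the upcoming turn.
    return {"screen": "normal",
            "dot": {"color": "blue", "side": turn_direction, "blink": True}}

print(navigation_cue("left", 80))  # blinking blue dot on the left
print(navigation_cue("left", 5))   # full blue screen: correct direction
```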
21. MPV-based Demo Application
Indoor application – using public Wi-Fi networks (received signal strength down to -90 dBm)
Outdoor application – using GPS (precision of at most 5 meters)
[Diagram: location detection in the app using five Wi-Fi access points, WAP 1 to WAP 5]
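Indoor positioning from a handful of public access points could follow a nearest-fingerprint scheme like the sketch below: compare the current RSSI scan against stored reference vectors and pick the closest. The fingerprint data, location names, and the Euclidean metric are illustrative assumptions, not the authors' implementation.

```python
import math

# Hypothetical RSSI fingerprints (dBm) for five access points at known spots.
FINGERPRINTS = {
    "corridor_A": {"WAP1": -45, "WAP2": -70, "WAP3": -80, "WAP4": -90, "WAP5": -88},
    "corridor_B": {"WAP1": -80, "WAP2": -48, "WAP3": -60, "WAP4": -85, "WAP5": -90},
    "lobby":      {"WAP1": -75, "WAP2": -78, "WAP3": -50, "WAP4": -55, "WAP5": -72},
}

def locate(scan: dict, floor_dbm: float = -95.0) -> str:
    """Return the fingerprint location whose stored RSSI vector is closest
    (Euclidean distance) to the current scan; unseen APs use a floor value."""
    def dist(ref):
        aps = set(ref) | set(scan)
        return math.sqrt(sum((scan.get(a, floor_dbm) - ref.get(a, floor_dbm)) ** 2
                             for a in aps))
    return min(FINGERPRINTS, key=lambda loc: dist(FINGERPRINTS[loc]))

print(locate({"WAP1": -50, "WAP2": -72, "WAP3": -82}))  # corridor_A
```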
23. MPV-based Demo Application – User is Notified to Turn Right
Turn Right
The navigation hint bars stimulate peripheral vision through motion. The bars' movement covers the entire 30-degree angular FOV of Google Glass.
24. MPV-based Demo Application – User is Notified to Turn Right
Turn Right
The bars on the head-mounted display occupy 25% of the screen width. Each time a bar appears, it blinks 4 times over 2 s, sweeping from the left to the right of the screen at a velocity of 15 deg/s.
25. MPV based Demo Application – User is Notified
to turn Right
Turn Right
This ensures that the velocity is far above the peripheral vision motion detection
threshold value of 2.15 deg/s at 90 degrees eccentricity.
26. MPV based Demo Application – User is Notified
to turn Right
Turn Right
Considering that the AFOV of Google Glass is 30 deg and that the bars move at 15 deg/s, the total periphery stimulation time is 2 s.
27. MPV-based Demo Application – User is Notified to Turn Left
Turn Left
This ensures that the user receives the hint in a reasonable time to react, as the visual reaction time to rapid movements is at least 250 ms.
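The timing numbers quoted on the last few slides fit together, as the short check below shows. All constants come from the slides themselves; the variable names are, of course, ours.

```python
# Parameters quoted on the slides.
glass_afov_deg = 30.0           # angular field of view of Google Glass
bar_velocity_deg_s = 15.0       # sweep velocity of the hint bars
motion_threshold_deg_s = 2.15   # peripheral motion-detection threshold
                                # at 90 degrees eccentricity
reaction_time_s = 0.250         # minimum visual reaction time to rapid movement

# Time for a bar to sweep the whole display.
stimulation_time_s = glass_afov_deg / bar_velocity_deg_s

print(stimulation_time_s)                           # 2.0 s, as stated
print(bar_velocity_deg_s > motion_threshold_deg_s)  # True: above threshold
print(stimulation_time_s > reaction_time_s)         # True: time enough to react
```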
30. MPV-based Demo Application – User is Notified to Turn Left
Turn Left
The cue is activated within a radius of 3 meters from each turn.
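For the outdoor case, activating the cue within 3 meters of a turn could reduce to a great-circle distance check like the sketch below. The coordinates and function names are hypothetical; only the 3-meter radius comes from the slide.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cue_active(user, turn, radius_m=3.0):
    """True once the user is within radius_m of the upcoming turn."""
    return haversine_m(*user, *turn) <= radius_m

# Hypothetical coordinates near HKUST: about 1 m apart vs. about 22 m apart.
turn = (22.3364, 114.2654)
print(cue_active((22.33641, 114.2654), turn))  # True
print(cue_active((22.3366, 114.2654), turn))   # False
```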
31. MPV-based Demo Application – 2 Tasks
Primary Task – Looking at Points of Interest
Secondary Task – Navigating along the path
32. Points of Interest (POI) – Primary Task
[Screenshots: POI for the indoor application and POI for the outdoor application]
40. Experiment Results
Indoor – the probability that the means of the two paired samples differ is at least 84%, 91%, and 74% for mental demand, physical demand, and frustration level respectively.
Outdoor – the corresponding probabilities are at least 86%, 92%, and 71%.
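A "probability that the means differ" of this kind can be estimated as 1 - p from a paired test on the per-user score differences; the sketch below uses a sign-flip permutation test on synthetic NASA-TLX-style scores (not the study's data), and is our illustration rather than the authors' analysis.

```python
import random

def prob_means_differ(pairs, n_perm=10000, seed=0):
    """Estimate P(means differ) as 1 - p from a paired sign-flip permutation
    test: under H0, each per-user difference is equally likely to be +/-."""
    diffs = [b - a for a, b in pairs]
    observed = abs(sum(diffs))
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        flipped = sum(d if rng.random() < 0.5 else -d for d in diffs)
        if abs(flipped) >= observed:
            extreme += 1
    return 1.0 - extreme / n_perm

# Synthetic (smartphone, smart glass) workload scores for eight users.
scores = [(6, 4), (7, 5), (5, 5), (6, 3), (7, 6), (5, 4), (6, 5), (7, 4)]
print(prob_means_differ(scores))
```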
41. Experiment Results
Users experience a low mental and physical demand when navigating with our MPV app.
By exploiting their peripheral vision with smart glasses, users spend at least 50% less time looking at the screen, freeing that time for their main activity.
90% of the users agree that using their peripheral vision for the navigation application is more beneficial and efficient than looking into the screen.
42. Special Cases
Color blindness – achromatopsia (no color detection)
Strabismus – misalignment of the eyes when looking at an object
43. Future Works
Extend our work and experiments to other scenarios, such as augmented reality and virtual reality games
Enhance our model with eye-tracking features
Expand our panel of users to people suffering from strabismus and color blindness, to refine the evaluation of the effect of such conditions on our model
44. Contributions – Recap
Present an MPV Model using color and motion to display visual cues in
the peripheral vision of the user
Develop a high-fidelity peripheral vision-based navigation application
for both indoor and outdoor environment scenarios
Discuss two specific cases, namely strabismus and color-blindness, for
which our MPV model does not apply
45. Acknowledgements
The authors thank the anonymous reviewers for their insightful comments. This research has been supported, in part, by projects 26211515, 16214817, and G-HKUST604/16 from the Research Grants Council of Hong Kong, as well as the 5GEAR project from the Academy of Finland ICT 2023 programme.