BREAKING THE BARRIERS TO
TRUE AUGMENTED REALITY
CHRISTIAN SANDOR
CHRISTIAN@SANDOR.COM
KEYNOTE AT 23RD INTERNATIONAL DISPLAY WORKSHOP
FUKUOKA, JAPAN
7 DECEMBER 2016
BURNAR: FEEL THE HEAT
MATT SWOBODA, THANH NGUYEN, ULRICH ECK, GERHARD REITMAYR, STEFAN HAUSWIESNER,
RENE RANFTL, AND CHRISTIAN SANDOR. DEMO AT IEEE INTERNATIONAL SYMPOSIUM ON MIXED
AND AUGMENTED REALITY, BASEL, SWITZERLAND, OCTOBER 2011. BEST DEMO AWARD
BURNAR: INVOLUNTARY HEAT SENSATIONS IN AR
PETER WEIR, CHRISTIAN SANDOR, MATT SWOBODA, THANH NGUYEN, ULRICH
ECK, GERHARD REITMAYR, AND ARINDAM DEY. PROCEEDINGS OF THE IEEE VIRTUAL
REALITY CONFERENCE, PAGES 43–46, ORLANDO, FL, USA, MARCH 2013.
WORKSHOP AT NAIST, AUGUST 2014
BREAKING THE BARRIERS TO TRUE AUGMENTED REALITY. ARXIV E-PRINTS, ARXIV:1512.05471 [CS.HC], 13 PAGES, DECEMBER 2015.
HTTP://ARXIV.ORG/ABS/1512.05471
DEFINITION:
1. UNDETECTABLE MODIFICATION OF USER’S PERCEPTION
2. GOAL: SEAMLESS BLEND OF REAL AND VIRTUAL WORLD
TRUE AR: WHAT?
HTTPS://EN.WIKIPEDIA.ORG/WIKI/TURING_TEST
ALAN TURING. COMPUTING MACHINERY AND INTELLIGENCE. MIND, 59(236):433–460, OCTOBER 1950.
INSPIRED BY ALAN TURING’S IMITATION GAME
PURPOSE: TEST QUALITY OF AI
RELATION TO OTHER TURING TESTS
COMPUTER GRAPHICS: MICHAEL D. MCGUIGAN. GRAPHICS TURING TEST. ARXIV E-PRINTS, ARXIV:CS/0603132V1, 2006.
VISUAL COMPUTING: QI SHAN, RILEY ADAMS, BRIAN CURLESS, YASUTAKA FURUKAWA, AND STEVEN M. SEITZ. THE VISUAL TURING TEST FOR SCENE RECONSTRUCTION. IN PROCEEDINGS OF 3DV, PAGES 25–32, 2013.
VIRTUAL REALITY
AUGMENTED REALITY
DIFFICULTY
TRUE AR: WHY?
TRAINING: SPORTS & SKILLS
AMUSEMENT: INTERACTIVE STORIES
SCIENCE: PSYCHOLOGY & NEUROSCIENCE
LAW: FORENSICS & CRIME SCENE LOGISTICS
STAR TREK HOLODECK. HTTPS://EN.WIKIPEDIA.ORG/WIKI/HOLODECK
TRUE AR: HOW?
MANIPULATING ATOMS: CONTROLLED MATTER
MANIPULATING PERCEPTION: SURROUND AR, PERSONALIZED AR, IMPLANTED AR
There have been a number of shape displays based on pin architecture. The FEELEX project [14] was one of the early attempts to design combined shapes and computer graphics displays that can be explored by touch. FEELEX consisted of several mechanical pistons actuated by motors and covered by a soft silicon surface. The images were projected onto its surface and synchronized with the movement of the pistons, creating simple shapes.
Lumen [32] is a low resolution, 13 by 13-pixel, bit-map display where each pixel can also physically move up and down (Figure 4). The resulting display can present both 2D graphic images and moving physical shapes that can be observed, touched, and felt with the hands. The 2D position sensor built into the surface of Lumen allows users to input commands and manipulate shapes with their hands.
Other related projects are the PopUp and Glowbits devices [18, 33]. PopUp consists of an array of rods that can be moved up and down using shape memory alloy actuators. The PopUp, however, does not have a visual and interactive component. Glowbits by Daniel Hirschmann (Figure 3) is a 2D array of rods with attached LEDs; the motorized rods can move up and down and the LEDs can change their colors.
Discussion
We have overviewed a number of reasons why actuation can be used in user interfaces. We summarize them in Table 1.
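The excerpt above describes how pin-array shape displays couple actuated pins with a synchronized image. As a rough illustration only, the Python sketch below maps a height map to per-pin actuator commands and a matching brightness image; set_pin_height() and project_image() are hypothetical stubs, since FEELEX and Lumen would expose their own drivers.

```python
# Rough sketch of a pin-array shape display update, in the spirit of FEELEX / Lumen.
import numpy as np

GRID = 13  # Lumen used a 13 x 13 grid of actuated pixels

def set_pin_height(i: int, j: int, h: float) -> None:
    pass  # placeholder: command actuator (i, j) to height h (metres)

def project_image(img: np.ndarray) -> None:
    pass  # placeholder: project a grayscale image onto the deformable surface

def update(t: float) -> None:
    """Compute a simple travelling-wave shape and keep pins and image in sync."""
    y, x = np.mgrid[0:GRID, 0:GRID] / (GRID - 1)
    height = 0.005 * (1.0 + np.sin(2.0 * np.pi * (x + t)))  # 0..1 cm relief
    image = np.uint8(255 * height / height.max())           # brightness follows height
    for (i, j), h in np.ndenumerate(height):
        set_pin_height(i, j, float(h))
    project_image(image)

update(0.0)
```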
[Thumbnail, Figure 2.7: Hand-fixed reference frame. Augmentations move with the user's hand; the example shows a user discussing a virtual map who, to view the map from different angles, can pick it up from his belt and put it in his hand.]
SACHIKO KODAMA. PROTRUDE, FLOW. ACM
SIGGRAPH 2001 ART GALLERY.
HTTP://PIXIEDUSTTECH.COM
CONTROLLED MATTER
HTTP://TANGIBLE.MEDIA.MIT.EDU/
PROJECT/INFORM
SURROUND VS. PERSONALIZED AR
MANIPULATING ATOMS: CONTROLLED MATTER
MANIPULATING PERCEPTION: SURROUND AR, PERSONALIZED AR, IMPLANTED AR
LIGHT FIELD DISPLAYS: FULL VS. PERCEIVABLE SUBSET
[Book excerpt (cropped) on disparity-based depth perception: convergence of the eyes and disparity around the horopter, the influence of disparity gradients, conflicting cues such as inconsistent convergence, and an upper modulation frequency of disparity; besides convergence and retinal disparity, further depth cues include accommodation and visual depth of field.]
LIGHT FIELD DISPLAYS
WWW.DISPLAYSBOOK.INFO
VISION:
DISPLAY AS WINDOW
Figure 9.35. Light-field recording and reconstruction principle: light rays just passing a window (left), light rays converted into pixel values on a tiny image sensor of a pinhole camera (center), light rays reproduced by a tiny projector being just an inverted pinhole camera (right).
[…] a distance. In principle, this turns out to be quite simple. Any camera with a sufficiently small aperture will just record angles and intensities of incident light rays and map them onto the pixels of its image sensor (Figure 9.35). Hence small cameras of, for example, 1 mm in size and a sufficient number of (in this case very tiny) pixels can deliver the light-field data for just one window segment, which we will call a pixel of the window. Any camera can in general be seen as an angle-to-position converter. The conversion is relatively robust with respect to geometric errors.
Reproducing the light field on a display is straightforward (at least in theory): we could use identical optical assemblies, this time illuminated […]
SENSOR ARRAY
DISPLAY ARRAY
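The passage above treats a tiny pinhole camera as an angle-to-position converter and a tiny projector as its inverse. The sketch below makes that mapping concrete for one "window pixel"; the 1 mm pinhole-to-sensor distance, sensor resolution, and pixel pitch are illustrative assumptions, not values prescribed by the book.

```python
# One "window pixel": a pinhole camera as an angle-to-position converter,
# and the inverted pinhole camera (tiny projector) as its inverse.
import numpy as np

FOCAL = 1e-3   # pinhole-to-sensor distance (m): a ~1 mm camera
N_PIX = 100    # sensor resolution per axis (illustrative)
PITCH = 2e-5   # sensor pixel pitch (m): 2 mm sensor covering roughly +/- 45 degrees

def ray_to_pixel(theta_x: float, theta_y: float) -> tuple[int, int]:
    """Recording side: map an incident ray direction (radians) to a sensor pixel."""
    x = FOCAL * np.tan(theta_x)            # where the ray lands on the sensor plane
    y = FOCAL * np.tan(theta_y)
    col = int(round(x / PITCH)) + N_PIX // 2
    row = int(round(y / PITCH)) + N_PIX // 2
    return row, col

def pixel_to_ray(row: int, col: int) -> tuple[float, float]:
    """Display side: the inverted pinhole camera emits one ray per pixel."""
    x = (col - N_PIX // 2) * PITCH
    y = (row - N_PIX // 2) * PITCH
    return float(np.arctan2(x, FOCAL)), float(np.arctan2(y, FOCAL))

print(ray_to_pixel(0.1, -0.05))            # a ray 0.1 rad off-axis lands on a unique pixel
```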
• Focus effects (blurring of objects not in the lens focus)
• Haze (softened image parts appear more distant)
• Color (bluish objects appear more distant)
• Motion parallax (images change when the head moves)
• Motion dynamics (objects change sizes and positions, in motion)
Convergence. As explained already, convergence is the inward rotation of the eyes when targeting a distant object (Figure 4.24). The state of the eye muscles gives us a hint about depth for up to 10 meters. However, we don't get extremely fine angular resolutions at this distance.
Figure 4.24. Convergence (up to 10 m).
Retinal disparity. For longer distances, the difference between the two images projected onto the retinas (called retinal disparity) is far more efficient than convergence. Near objects block distant ones at slightly different positions, resulting in different images generated by the left and right eyes (Figure 4.25).
VERGENCE / ACCOMMODATION
GOAL: NATURAL HUMAN VISUAL PERCEPTION
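As a small worked example of the two binocular cues just described, the sketch below computes the vergence angle for a fixated point and the depth step that a one arc minute disparity threshold can resolve. The 6 cm interocular distance matches the excerpt; the threshold value and the sample distances are illustrative assumptions of this sketch.

```python
# Worked example: vergence angle and disparity-limited depth resolution.
import math

IPD = 0.06                       # interocular distance (m), per the excerpt
ARCMIN = math.radians(1 / 60)    # 1 arc minute disparity threshold (assumption)

def vergence_angle(d: float) -> float:
    """Full convergence angle (rad) when both eyes fixate a point at distance d (m)."""
    return 2.0 * math.atan(IPD / (2.0 * d))

def relative_disparity(d_fix: float, d_obj: float) -> float:
    """Disparity (rad) of an object at d_obj while the eyes fixate at d_fix."""
    return vergence_angle(d_obj) - vergence_angle(d_fix)

for d in (0.5, 2.0, 10.0):       # illustrative fixation distances
    print(f"fixation at {d:5.1f} m: vergence angle = {math.degrees(vergence_angle(d)):.2f} deg")

# Smallest depth step resolvable around 2 m if 1 arc minute of disparity is detectable
# (small-angle approximation: delta_disparity ~= IPD * delta_z / d^2):
d = 2.0
delta_z = ARCMIN * d ** 2 / IPD
print(f"depth resolution near {d} m: about {100 * delta_z:.1f} cm")
```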
PERSONALIZED AR:
A SMARTER APPROACH
The differences at object edges can be perceived up to the crispness limit of our vision. With a typical eye-to-eye distance (also called interocular distance) of about six centimeters and an angular resolution of one [arc minute …]
KEY IDEA: MEASURE HUMAN VISUAL SYSTEM & DISPLAY SUBSET OF LIGHT FIELD
BENEFIT: REDUCE REQUIRED DISPLAY PIXELS BY SEVERAL ORDERS OF MAGNITUDE
WILL BE ACHIEVED WELL BEFORE SURROUND AR!
VERGENCE / ACCOMMODATION
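A back-of-envelope calculation behind the "several orders of magnitude" claim above: a surround light-field window must emit rays for every possible eye position and direction, while a personalized display only has to produce the rays entering the user's two tracked pupils. Every number in this sketch (window size, sample pitches, field of view, acuity) is a rough illustrative assumption.

```python
# Rough pixel-budget comparison: full light-field window vs. personalized AR subset.
WINDOW_W, WINDOW_H = 1.0, 0.6       # window-style surround display (m)
SPATIAL_PITCH = 1e-3                # spatial sample spacing on the window (m)
ANGULAR_PITCH_DEG = 0.05            # angular sample spacing across the viewing zone
VIEW_CONE_DEG = (90, 60)            # horizontal x vertical directions to cover

directions = (VIEW_CONE_DEG[0] / ANGULAR_PITCH_DEG) * (VIEW_CONE_DEG[1] / ANGULAR_PITCH_DEG)
window_pixels = (WINDOW_W / SPATIAL_PITCH) * (WINDOW_H / SPATIAL_PITCH)
full_lightfield = window_pixels * directions

PIX_PER_DEG = 60                    # ~1 arc minute pixels (foveal acuity everywhere)
FOV_DEG = 100                       # per-eye field of view (deg)
personalized = 2 * (FOV_DEG * PIX_PER_DEG) ** 2   # two tracked eyes

print(f"full light-field window : {full_lightfield:.1e} samples")
print(f"personalized subset     : {personalized:.1e} samples")
print(f"reduction               : ~{full_lightfield / personalized:.0f}x")
```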
PHILOSOPHY: TRUE AUGMENTED REALITY
DISPLAYS
[Thumbnails of the two papers discussed in this section: Kohei Oshima et al., SharpView: Improved Clarity of Defocused Content on Optical See-Through Head-Mounted Displays (Figure 1: the cause and effect of focus blur in OST HMD systems; blurring occurs when the virtual display screen and real-world imagery are viewed at unequal focal distances); and Kenneth Moser et al., Subjective Evaluation of a Semi-Automatic Optical See-Through Head-Mounted Display Calibration Technique, IEEE TVCG 21(4), 2015.]
GEOMETRIC ALIGNMENT
REMOVE BLUR ARTIFACTS
CREATE CORRECT BLUR
GEOMETRIC ALIGNMENT: SPAAM
MIHRAN TUCERYAN, YAKUP GENC, AND NASSIR NAVAB. SINGLE-POINT ACTIVE
ALIGNMENT METHOD (SPAAM) FOR OPTICAL SEE-THROUGH HMD CALIBRATION
FOR AUGMENTED REALITY. PRESENCE: TELEOPERATORS AND VIRTUAL
ENVIRONMENTS, 11(3):259-276, JUNE 2002.
[SPAAM diagram: the user repeatedly aligns a Screen Pixel (x, y) with a World Point from different head poses; tH-P denotes the tracked head-to-point transform for each alignment.]
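The core computation behind SPAAM is a standard direct linear transform: each alignment of an on-screen pixel with the world point, taken from a different head pose, contributes one 2D-3D correspondence, and the 3x4 projection from the head-fixed frame to screen pixels is recovered from the stacked constraints. The sketch below shows that solve in generic DLT form; it is not the authors' exact implementation.

```python
# Generic DLT solve for a 3x4 projection matrix from 2D-3D correspondences.
import numpy as np

def spaam_dlt(points_3d: np.ndarray, points_2d: np.ndarray) -> np.ndarray:
    """Estimate the 3x4 projection G (head frame -> screen pixels) from N >= 6
    correspondences: points_3d is (N, 3), points_2d is (N, 2)."""
    rows = []
    for (X, Y, Z), (x, y) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)          # right null-space vector, defined up to scale

def project(G: np.ndarray, point_3d) -> np.ndarray:
    """Project a 3D head-frame point to screen pixels with the calibrated G."""
    h = G @ np.append(np.asarray(point_3d, dtype=float), 1.0)
    return h[:2] / h[2]
```

At least six well-spread correspondences are needed; the study above compares repeating this user-driven procedure against the semi-automatic Recycled INDICA approach.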
[Figures from Moser et al.: Fig. 9, stages of the experimental procedure (initial SPAAM calibration, recording of eye images, both tasks performed per condition after refitting the HMD); Fig. 10, mean subjective quality values for each calibration method (SPAAM, DSPAAM, INDICA) and task, normalized to a 1–4 scale, with Ryan REGWQ post-hoc subset tests; Fig. 11, mean Pillars task error along the X (left-right) and Z (front-back) directions, where 4 cm of error equals one pillar location; Figs. 12 and 13, mean vertical and horizontal Cubes grid task errors, where 2 cm of error equals one cube location.]
[Abstract excerpt: with the growing availability of optical see-through (OST) head-mounted displays (HMDs), there is a present need for robust, uncomplicated, and automatic calibration methods suited for non-expert users. A user study objectively and subjectively examines registration accuracy produced by three OST HMD calibration methods: (1) SPAAM, (2) Degraded SPAAM, and (3) Recycled INDICA, a recently developed semi-automatic calibration method. All three methods produce very accurate horizontal registration but cause the distance of virtual objects to be perceived as closer than intended; surprisingly, the semi-automatic method produces the best vertical registration and perceived object distance, and the highest user-assessed quality values, confirming that Recycled INDICA matches or exceeds common OST HMD calibration methods.]
KENNETH MOSER, YUTA ITOH, KOHEI OSHIMA, EDWARD SWAN, GUDRUN KLINKER, AND
CHRISTIAN SANDOR. SUBJECTIVE EVALUATION OF A SEMI-AUTOMATIC OPTICAL SEE-
THROUGH HEAD-MOUNTED DISPLAY CALIBRATION TECHNIQUE. IEEE TRANSACTIONS
ON VISUALIZATION AND COMPUTER GRAPHICS, 21(4):491–500, MARCH 2015.
OUR METHOD: ONLY SPAAM ONCE
BLUR ARTIFACTS
DESIRED
MOST DISPLAYS
REAL PHOTO
“MATCHING” IMAGE
KOHEI OSHIMA, KENNETH R MOSER, DAMIEN CONSTANTINE ROMPAPAS, J EDWARD
SWAN II, SEI IKEDA, GOSHIRO YAMAMOTO, TAKAFUMI TAKETOMI, CHRISTIAN SANDOR,
AND HIROKAZU KATO. SHARPVIEW: IMPROVED CLARITY OF DEFOCUSED CONTENT ON OPTICAL
SEE-THROUGH HEAD-MOUNTED DISPLAYS. IN IEEE SYMPOSIUM ON 3D USER
INTERFACES, PAGES 173–181, GREENVILLE, SOUTH CAROLINA, USA, MARCH 2016.
OUR METHOD: SHARPVIEW
SHARPVIEW
REAL PHOTO
“MATCHING” IMAGE
ESTIMATING EYE PSF
[Excerpt, reconstructed from the paper:] The rendered image O is pre-corrected with a Wiener filter built from the estimated point spread function C of the eye before it is displayed on the HMD (Equation 4). The focus blur, caused by accommodation differences between the display screen and the world in OST AR, must be determined at run time. A Gaussian function is used to approximate the PSF, trading a small reduction in accuracy for faster update rates:

P(x, y) = 1 / (2πσ²) · exp(−(x² + y²) / (2σ²))   (5)

The blur size s on the retina relates to the blur size s_d on the screen as s : s_d = v : u₀, and s_d is directly obtainable as

s_d = (a / 2) · (1 − u₀ / u)

where a is the pupil diameter, u is the distance from the eye to the world gaze point, and u₀ is the distance from the eye to the display image plane; when convolving the filter with the screen image, s_d is converted into screen-pixel units. [Figure 2: the optical system formed by the user's eye and an OST HMD; the imaging plane corresponds to the user's retina and the lens aperture to the user's pupil. Measurement of the pupil is advisable, at the cost of additional system complexity.]
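A minimal sketch of that pipeline, under the Gaussian PSF model reconstructed above: estimate the blur radius s_d from the pupil diameter and the two focal distances, build the Gaussian PSF, and pre-sharpen the virtual image with a Wiener filter before sending it to the display. The signal-to-noise constant and the metres-to-pixels conversion are assumptions of this sketch, not the paper's calibrated values.

```python
# SharpView-style Wiener pre-correction under a Gaussian PSF model (sketch).
import numpy as np

def blur_radius(a: float, u: float, u0: float) -> float:
    """s_d = (a/2) * (1 - u0/u): blur radius for a display plane at distance u0
    while the eye accommodates to a real object at distance u (pupil diameter a, metres)."""
    return abs(a / 2.0 * (1.0 - u0 / u))

def gaussian_psf_fft(shape, sigma_px: float) -> np.ndarray:
    """Frequency-domain Gaussian PSF matching P(x, y) above, with sigma in pixels."""
    h, w = shape
    y = np.fft.fftfreq(h)[:, None] * h          # wrap-around pixel coordinates
    x = np.fft.fftfreq(w)[None, :] * w
    psf = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma_px ** 2))
    psf /= psf.sum()
    return np.fft.fft2(psf)

def sharpview(image: np.ndarray, sigma_px: float, snr: float = 100.0) -> np.ndarray:
    """Wiener pre-correction: boost the frequencies the eye's defocus will attenuate."""
    C = gaussian_psf_fft(image.shape, sigma_px)
    W = np.conj(C) / (np.abs(C) ** 2 + 1.0 / snr)
    corrected = np.real(np.fft.ifft2(W * np.fft.fft2(image)))
    return np.clip(corrected, 0.0, 1.0)

# Illustrative use: s_d for a 4 mm pupil, display plane at 2 m, eye accommodated to 0.25 m;
# converting s_d to pixels depends on the display geometry and is omitted here.
s_d = blur_radius(a=4e-3, u=0.25, u0=2.0)
corrected = sharpview(np.random.rand(128, 128), sigma_px=3.0)
```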
OUR EXPERIMENT
Figure 5: Location of subjects relative to reference images placed at 25 cm (a) and 500 cm (b) from the subjects' eyes.
[The display was] capable of presenting stereo imagery at 60 Hz with a maximum resolution of 960×540 per eye. The focal distance of the display was both independently measured and confirmed by the manufacturer.
MATCHING BLUR: REAL & VIRTUAL
DAMIEN CONSTANTINE ROMPAPAS, AITOR ROVIRA, SEI IKEDA, ALEXANDER
PLOPSKI, TAKAFUMI TAKETOMI, CHRISTIAN SANDOR, AND HIROKAZU KATO.
EYEAR: REFOCUSABLE AUGMENTED REALITY CONTENT THROUGH EYE
MEASUREMENTS. DEMO AT IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND
AUGMENTED REALITY, MERIDA, MEXICO, SEPTEMBER 2016. BEST DEMO AWARD
[Thesis excerpt (cropped):] users can focus their eyes on any part of the scene, and the CG will always reflect the [focal state] of the user's eye (see Figure 1.7 for an example).
Figure 1.5: Example of a user looking into the box enclosure. Left: without EyeAR, [one] can observe the DoF mismatch between CG (white hat) and real scene (dragon). [Right:] with EyeAR, the CG's DoF accurately matches the natural DoF of the real scene.
Because EyeAR is able to create accurate DoF images on an OST-HMD display, al[l such] HMDs should include this functionality. However, the applications of EyeAR a[re not] limited to creating indistinguishable AR content, as our system directly measures t[he user's eye]. For example, SharpView (Oshima et al., 2015) sharpens content displayed on the [HMD] by approximating the user's eye point spread function based on the user's eye pu[pil].
[Figure caption:] Typical AR on an OST-HMD scene with the user focusing on the objects in [the back]; for objects in front there is a DoF mismatch between CG (hat) and real scene, highlighted with the white circle.
OUR DISPLAY
MOST DISPLAYS
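A minimal sketch of the EyeAR idea under a thin-lens approximation: take the eye's measured accommodation distance (from the autorefractometer), compute the angular blur each virtual object's depth would produce, and blur that object's layer accordingly so CG and real-scene depth of field agree. The pupil diameter, the pixels-per-radian conversion, and the Gaussian blur stand-in are assumptions of this sketch, not the authors' exact renderer.

```python
# Matching the depth of field of CG to the eye's measured accommodation (sketch).
import numpy as np
from scipy.ndimage import gaussian_filter

def angular_blur(pupil_d: float, focus_dist: float, obj_dist: float) -> float:
    """Blur-disk angular diameter (rad) ~= pupil diameter (m) x defocus (diopters)."""
    return pupil_d * abs(1.0 / obj_dist - 1.0 / focus_dist)

def match_dof(layer: np.ndarray, obj_dist: float, focus_dist: float,
              pupil_d: float = 4e-3, px_per_rad: float = 3000.0) -> np.ndarray:
    """Blur one virtual-object layer so its defocus matches the measured eye state.
    px_per_rad converts angular blur into screen pixels (display-dependent assumption)."""
    sigma_px = angular_blur(pupil_d, focus_dist, obj_dist) * px_per_rad / 2.0
    return gaussian_filter(layer, sigma=sigma_px)

# Example: the autorefractometer reports accommodation at 0.5 m, so a virtual object
# rendered at 2 m should appear defocused (all numbers illustrative):
layer = np.zeros((128, 128)); layer[60:68, 60:68] = 1.0
blurred = match_dof(layer, obj_dist=2.0, focus_dist=0.5)
```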
OUR FIRST AR TURING TEST
[Figure 7: Overall percentage of correct guesses (roughly 40% to 80%) for each virtual pillar, Green (0.25 m), Blue (0.375 m), and Red (0.5 m), with the autorefractometer on (red line) and off (blue line).]
12 PARTICIPANTS
12 GUESSES
VIRTUAL
REAL
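Assuming a two-alternative real-vs-virtual judgement (chance level 50%), a natural check on the 12 x 12 guesses above is a binomial test against chance: responses near chance mean the virtual pillar passed this limited Turing test. The counts below are illustrative, not the study's data, and pooling guesses across participants ignores per-subject effects.

```python
# Binomial check of correct-guess counts against the 50% chance level.
from scipy.stats import binomtest

N = 12 * 12                          # 12 participants x 12 guesses each

for correct in (72, 80, 90):         # 50%, ~56%, ~63% correct (illustrative counts)
    p = binomtest(correct, N, p=0.5, alternative="two-sided").pvalue
    print(f"{correct}/{N} correct guesses: p = {p:.3f} vs. 50% chance")
# Rates that stay near 72/144 are statistically consistent with guessing, i.e. the
# virtual pillar passes this (limited) Turing test; rates well above chance fail it.
```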
DISPLAYS
PHILOSOPHY: TRUE AUGMENTED REALITY
RESEARCH IN CANON
CHRISTIAN SANDOR, TSUYOSHI KUROKI, AND SHINJI UCHIYAMA. INFORMATION
PROCESSING METHOD AND DEVICE FOR PRESENTING HAPTICS RECEIVED FROM A
VIRTUAL OBJECT. JAPANESE PATENT 2006117732 (FILED 4/2006). PATENT IN CHINA,
EUROPE, AND US 8,378,997 (FILED 19 APRIL 2007). HTTP://GOO.GL/V3DAX
RESEARCH IN CANON
CHRISTIAN SANDOR, SHINJI UCHIYAMA, AND HIROYUKI YAMAMOTO. VISUO-
HAPTIC SYSTEMS: HALF-MIRRORS CONSIDERED HARMFUL. IN PROCEEDINGS OF
THE IEEE WORLD HAPTICS CONFERENCE, PAGES 292–297. IEEE, MARCH 2007.
TSUKUBA, JAPAN.
DISPLAYS
APPLICATIONS
PHILOSOPHY: TRUE AUGMENTED REALITY
EDGE-BASED X-RAY
BENJAMIN AVERY, CHRISTIAN SANDOR, BRUCE H. THOMAS. IMPROVING SPATIAL
PERCEPTION FOR AUGMENTED REALITY X-RAY VISION. IN PROCEEDINGS OF THE IEEE VIRTUAL
REALITY CONFERENCE, PAGES 79–82. IEEE, MARCH 2009. LAFAYETTE, LOUISIANA, USA.
SALIENCY X-RAY
CHRISTIAN SANDOR, ANDREW CUNNINGHAM, ARINDAM DEY, AND VILLE-VEIKKO
MATTILA. AN AUGMENTED REALITY X-RAY SYSTEM BASED ON VISUAL SALIENCY. IN
PROCEEDINGS OF THE IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND
AUGMENTED REALITY, PAGES 27–36, SEOUL, KOREA, OCTOBER 2010.
SALIENCY X-RAY
CHRISTIAN SANDOR, ANDREW CUNNINGHAM, AND VILLE-VEIKKO MATTILA.
METHOD AND APPARATUS FOR AN AUGMENTED REALITY X-RAY. US PATENT
APPLICATION 12/785,170 (FILED 21 MAY 2010). HTTP://GOO.GL/NCVZJ
MELTING
CHRISTIAN SANDOR, ANDREW CUNNINGHAM, ULRICH ECK, DONALD URQUHART, GRAEME JARVIS,
ARINDAM DEY, SEBASTIEN BARBIER, MICHAEL R. MARNER, SANG RHEE. EGOCENTRIC SPACE-DISTORTING
VISUALIZATIONS FOR RAPID ENVIRONMENT EXPLORATION IN MOBILE MIXED REALITY. IN PROCEEDINGS
OF THE IEEE VIRTUAL REALITY CONFERENCE, PAGES 47–50, WALTHAM, MA, USA, MARCH 2010.
[Image: a "Rehabilitation & Sports Medicine: Frozen Shoulder" exercise handout showing range-of-motion exercises (circular pendulum, flexion, external rotation, abduction, towel stretch for internal rotation, and scapular sets), each with hold times, repetitions, and sessions per day.]
FUTURE WORK: MEDICAL APPLICATIONS
FUTURE WORK: MEDICAL APPLICATIONS
COURTESY OF HTTP://CAMPAR.IN.TUM.DE/MAIN/FELIXBORK
CONCLUSIONS
SUMMARY
AR: EXTREMELY HIGH POTENTIAL (UNLIKE VR)
INTERDISCIPLINARY: COMPUTER GRAPHICS, COMPUTER VISION, OPTICS, PERCEPTION RESEARCH
REQUEST
CHAT TO ME AT IDW! LOOKING FOR GOOD COLLABORATORS
CHRISTIAN@SANDOR.COM
SLIDES WILL BE ONLINE WITHIN ONE HOUR!
HTTP://WWW.SLIDESHARE.NET/CHRISTIANSANDOR