Computational near-eye displays aim to address the vergence-accommodation conflict in virtual reality by dynamically adjusting focus based on a user's gaze. Studies show that adaptive focus displays can improve visual clarity and comfort for most users compared to conventional VR displays. However, presbyopic users over age 40 may still experience some visual issues due to reduced ability to accommodate. Future near-eye displays could explore technologies like multiplane displays and light field displays to better support users of all ages.
Inspired by Wheatstone’s original stereoscope and augmenting it with modern factored light field synthesis, we present a new near-eye display technology that supports focus cues. These cues are critical for mitigating visual discomfort experienced in commercially-available head mounted displays and providing comfortable, long-term immersive experiences.
Millions of people worldwide need glasses or contact lenses to see or read properly. We introduce a computational display technology that predistorts the presented content for an observer, so that the target image is perceived without the need for eyewear. We demonstrate a low-cost prototype that can correct myopia, hyperopia, astigmatism, and even higher-order aberrations that are difficult to correct with glasses.
Tailored Displays to Compensate for Visual Aberrations - SIGGRAPH Presentation - Vitor Pamplona
Can we create a display that adapts itself to improve one's eyesight? Top figure compares the view of a 2.5-diopter farsighted individual in regular and tailored displays. We use currently available inexpensive technologies to warp light fields to compensate for refractive errors and scattering sites in the eye.
A new gaze-contingent rendering mode for VR/AR that renders in perceptually correct ocular parallax which benefits depth perception and perceptual realism.
Head Mounted Displays: How to realize ultimate AR experiences? - Yuta Itoh
Slides from a talk I gave at:
Wissens-Austausch-Workshop: The 3rd Workshop on Visualisierung großer Datenmengen in der Wissenschaft (Visualization of Large Data Sets in Science) (VisDa3? WAW), DLR@Oberpfaffenhofen, Germany, Jun. 9-10, 2015.
For more detail of my work, see:
http://campar.in.tum.de/Main/YutaItoh
This presentation covers stereoscopic imaging in detail: its history, an introduction, how it works, 3D viewers, 3D cameras, future scope, advantages, and disadvantages. In short, it is a complete overview of the topic.
Google Glass, The META and Co. - How to calibrate your Optical See-Through He... - Jens Grubert
Slides from our ISMAR 2014 tutorial http://stctutorial.icg.tugraz.at/
Abstract:
Head Mounted Displays such as Google Glass and the META have the potential to spur consumer-oriented Optical See-Through Augmented Reality applications. A correct spatial registration of those displays relative to a user’s eye(s) is an essential problem for any HMD-based AR application.
At our ISMAR 2014 tutorial we provide an overview of established and novel approaches for the calibration of those displays (OST calibration), including hands-on experience in which participants calibrate such head-mounted displays.
A compressive approach to light field synthesis with projection devices. We propose a novel, passive screen design that is combined with high-speed light field projection and nonnegative light field factorization. We demonstrate that the projector can alternatively achieve super-resolved and high dynamic range 2D image display when used with a conventional screen.
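The nonnegative light field factorization mentioned above can be illustrated with a toy rank-1 sketch: a (views x pixels) light field matrix is approximated by the product of two nonnegative "layer" patterns via multiplicative updates. This is a generic stand-in, not the paper's display-geometry-aware solver; the function name and parameterization are illustrative.

```python
import numpy as np

# Toy stand-in for nonnegative light field factorization: approximate a
# (views x pixels) matrix L by a rank-1 product of two nonnegative
# vectors f and g, using the standard multiplicative update rule.
def rank1_nmf(L, iters=200, eps=1e-9):
    rng = np.random.default_rng(0)
    f = rng.random(L.shape[0]) + eps
    g = rng.random(L.shape[1]) + eps
    for _ in range(iters):
        # multiplicative updates keep f and g nonnegative throughout
        f *= (L @ g) / (np.outer(f, g) @ g + eps)
        g *= (L.T @ f) / (np.outer(g, f) @ f + eps)
    return f, g

# Usage: factor an exactly rank-1 nonnegative matrix and check the fit.
L = np.outer([1.0, 2.0, 3.0], [0.5, 1.0, 0.25])
f, g = rank1_nmf(L)
err = np.linalg.norm(np.outer(f, g) - L)
```

For an exactly rank-1 nonnegative matrix the updates recover the factors up to a scale ambiguity, so the reconstruction error drops essentially to zero; real display factorizations use higher ranks and geometry-dependent propagation operators.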
Stereoscopic 3D: Generation Methods and Display Technologies for Industry and... - Ray Phan
This was a talk I gave to a 4th year (senior-level) undergraduate class in Human Computer Interaction at Ryerson University. The talk focused on the different methods of displaying Stereoscopic 3D content, as well as the methods on generating such content. Technologies such as DLP 3DTVs, 3D theatres, and autostereoscopic displays are discussed. For the methods, 3D cameras, 2D to 3D conversion and other popular methods are discussed.
We have built a camera that can look around corners and beyond the line of sight. The camera uses light that travels from the object to the camera indirectly, by reflecting off walls or other obstacles, to reconstruct a 3D shape.
These slides use concepts from my (Jeff Funk) course, Analyzing Hi-Tech Opportunities, to analyze how light field technology is becoming economically feasible for an increasing number of applications. Light field cameras record the full light field of a scene instead of a single 2D image. This capability enables users to change the focus of pictures after they have been taken and to record 3D data more easily. These features are becoming economically feasible because of rapid improvements in camera chips and micro-lens arrays (an example of micro-electro-mechanical systems, MEMS). They offer alternative ways to do 3D sensing for automated vehicles and augmented reality, and can enable faster data collection with telescopes.
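The post-capture refocusing described above is commonly implemented with synthetic-aperture "shift-and-add": each sub-aperture view is translated in proportion to its angular offset and a chosen refocus parameter, then all views are averaged. The array layout and the `alpha` parameterization below are illustrative assumptions, not any specific camera's API.

```python
import numpy as np

def refocus(views, alpha):
    """Shift-and-add refocusing sketch.

    views: array of shape (U, V, H, W) of sub-aperture images.
    alpha: refocus parameter (0 = no shift; larger magnitude moves the
    synthetic focal plane nearer or farther).
    """
    U, V, H, W = views.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # shift each view proportionally to its offset from the center
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            # integer shifts for simplicity; real pipelines interpolate
            # sub-pixel shifts
            out += np.roll(views[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)
```

Scene points at the depth matched by `alpha` align across views and stay sharp, while points at other depths are averaged over misaligned positions and blur out.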
Introduces the use of Event Tracing for Windows in OSVR, the Open Source Virtual Reality software framework, with annotated screenshots.
See http://osvr.github.io/presentations/20150901-Intro-ETW-OSVR/ for PDF version more suitable for close investigation of the screenshots.
COMP4010 Lecture 4 - VR Technology - Visual and Haptic Displays. Lecture about VR visual and haptic display technology. Taught on August 16th 2016 by Mark Billinghurst from the University of South Australia
COMP 4010 Lecture 12 - Research Directions in AR and VR - Mark Billinghurst
COMP 4010 lecture on research directions in AR and VR, taught by Mark Billinghurst on November 2nd 2017 at the University of South Australia. This is the final lecture in the 2017 COMP 4010 course on AR and VR
Binocular Eye Tracking and Calibration in Head-mounted Displays - Michael Stengel
Presentation slides from my talk on eye tracking and gaze-contingency for Virtual Reality applications.
In this talk I present the eye tracking head-mounted display proposed in the paper "An Affordable Solution for Binocular Eye Tracking and Calibration in Head-mounted Displays".
The paper won the "Best Student Paper Award" at ACM Multimedia 2015 in Brisbane, Australia.
Lecture 10 from the COMP 4010 course on AR/VR. This final lecture talks about future research directions in AR/VR. Taught on October 30th 2018 at the University of South Australia.
A major challenge for the next decade is to design virtual and augmented reality systems (VR at large) for real-world use cases such as healthcare, entertainment, e-education, and high-risk missions. This requires VR systems to operate at scale, in a personalized manner, remaining bandwidth-tolerant whilst meeting quality and latency criteria. One key challenge to reach this goal is to fully understand and anticipate user behaviours in these mixed reality settings.
This can be accomplished only by fundamentally redesigning network and VR systems to put the interactive user at the heart of the system rather than at the end of the chain. With this goal in mind, in this talk we describe our current research on user-centric systems. First, we describe our viewport-based streaming strategies for 360-degree video. Then, we present in more detail our research on user behaviour analysis when users interact with 360-degree content. Specifically, we describe a set of metrics that allows us to identify key behaviours among users and quantify their similarity, and we present our clique-based clustering methodology together with information-theoretic and trajectory-based in-depth analyses. Finally, we conclude with an overview of the extension of this work to navigation within volumetric video sequences.
Keynote speech given by Mark Billinghurst at the QCon 2018 conference on April 22nd in Beijing, China. The talk identified important future research directions for Augmented Reality.
Presentation on trends and future research directions in Augmented Reality. Given by Mark Billinghurst at the Smart Cloud 2015 conference on September 16th, 2015, in Seoul, Korea.
CAMAR 2.0; Context-aware Mobile Augmented Reality 2.0; R&D Activities @ GIST U-VR Lab 2009; slide presented at 12th MobileWebAppsCamp (Mobile UX and Mobile AR) in Seoul, Korea
Next Gen Computational Ophthalmic Imaging for Neurodegenerative Diseases and ... - Petteri Teikari, PhD
Shallow literature analysis on recent trends in computational ophthalmic imaging with focus on neurodegenerative disease imaging / oculomics.
Open-ended literature review on what you could be building next.
#1/2: Hardware
#2/2: Computational imaging
Alternative download link:
https://www.dropbox.com/scl/fi/d34pgi3xopfjbrcqj2lvi/retina_imaging_2024_computational.pdf?rlkey=xnt1dbe8rafyowocl9cbgjh3p&dl=0
Orkun Oguz (CyVision): Critical Factors for the Mass Adaption of XR - AugmentedWorldExpo
A talk from the Develop Track at AWE USA 2018 - the World's #1 XR Conference & Expo in Santa Clara, California May 30- June 1, 2018.
Orkun Oguz (CyVision): Critical Factors for the Mass Adaption of XR
This session focuses on visual comfort as an additional factor for the success of the XR industry, looking at the natural viewing experience, high latency, the vergence-accommodation conflict, and the need to present information with true depth cues, even for one eye, which is required for a comfortable viewing experience without visual conflicts.
http://AugmentedWorldExpo.com
Augmented Reality: The Next 20 Years (AWE Asia 2015) - Mark Billinghurst
Keynote speech given by Mark Billinghurst at the AWE Asia 2015 conference on October 18th 2015. The talk gives an outline of future developments in Augmented Reality
This is a guest lecture given by Mark Billinghurst at the University of Sydney on March 27th 2024. It discusses some future research directions for Augmented Reality.
Imaging objects obscured by occluders is a significant challenge for many applications. A camera that could “see around corners” could help improve navigation and mapping capabilities of autonomous vehicles or make search and rescue missions more effective. Time-resolved single-photon imaging systems have recently been demonstrated to record optical information of a scene that can lead to an estimation of the shape and reflectance of objects hidden from the line of sight of a camera. However, existing non-line-of-sight (NLOS) reconstruction algorithms have been constrained in the types of light transport effects they model for the hidden scene parts. We introduce a factored NLOS light transport representation that accounts for partial occlusions and surface normals. Based on this model, we develop a factorization approach for inverse time-resolved light transport and demonstrate high-fidelity NLOS reconstructions for challenging scenes both in simulation and with an experimental NLOS imaging system.
VR2.0: Making Virtual Reality Better Than Reality?
1. Making Virtual Reality better than Reality?
Gordon Wetzstein
Stanford University
IS&T Electronic Imaging 2017
www.computationalimaging.org
2.
3. Personal Computer (e.g. Commodore PET 1983) → Laptop (e.g. Apple MacBook) → Smartphone (e.g. Google Pixel) → AR/VR (e.g. Microsoft HoloLens) → ???
4. A Brief History of Virtual Reality
1838: Stereoscopes (Wheatstone, Brewster, …)
1968: VR & AR (Ivan Sutherland)
1995: Nintendo Virtual Boy
2012-2017: VR explosion (Oculus, Sony, HTC, MS, …)
VR 2.0
17. How Many People Have Normal Vision?
• Presbyopia: 43% at age 40, 68% at age 80+ [Katz et al. 1997]
• Hyperopia: 25% [Krachmer et al. 2005]
• Myopia: 41.6% [Vitale et al. 2009]
All numbers for the US population.
18. Nearsightedness & Farsightedness
Figure: focal range (range of clear vision) on an axis from 4D (25 cm) to optical infinity, for normal, nearsighted/myopic, farsighted/hyperopic, and presbyopic vision. Modified from Pamplona et al., Proc. SIGGRAPH 2010.
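The dioptric axis used throughout these slides follows from the definition of optical power as the reciprocal of distance in meters, which is why 4D corresponds to 25 cm and 0D to optical infinity. A minimal conversion sketch:

```python
# Optical power in diopters is the reciprocal of distance in meters:
# 4 D <-> 0.25 m, 1 D <-> 1 m, 0 D <-> optical infinity.

def diopters(distance_m: float) -> float:
    """Convert a focal distance in meters to diopters."""
    return 1.0 / distance_m

def distance_in_m(power_d: float) -> float:
    """Convert optical power in diopters back to a distance in meters."""
    return 1.0 / power_d
```

This reciprocal relationship is why display and vision-science plots use diopters: equal steps in diopters correspond to roughly equal steps of accommodative effort, unlike equal steps in meters.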
20.-21. • Q1: Can computational displays effectively replace glasses in VR/AR?
• Q2: How can the vergence-accommodation conflict be addressed for users of different ages?
• Q3: What are (in)effective near-eye display technologies?
Possible solutions: gaze-contingent focus, monovision, multiplane, light field displays, …
25. Adaptive Focus - History
• M. Heilig, "Sensorama", 1962 (US Patent #3,050,870)
• P. Mills, H. Fuchs, S. Pizer, "High-Speed Interaction On A Vibrating-Mirror 3D Display", SPIE 0507, 1984
• S. Shiwa, K. Omura, F. Kishino, "Proposal for a 3-D display with accommodative compensation: 3DDAC", JSID 1996
• S. McQuaide, E. Seibel, J. Kelly, B. Schowengerdt, T. Furness, "A retinal scanning display system that produces multiple focal planes with a deformable membrane mirror", Displays 2003
• S. Liu, D. Cheng, H. Hua, "An optical see-through head mounted display with addressable focal planes", Proc. ISMAR 2008
Timeline: manual focus adjustment (Heilig 1962) → automatic focus adjustment (Mills 1984) → deformable mirrors & lenses (McQuaide 2003, Liu 2008)
39. Task
• How sharp is the target? (blurry, medium, sharp)
• Is the target fused? (yes, no)
Four simulated distances: 4D (0.25 m), 3D (0.33 m), 2D (0.50 m), 1D (1 m)
43. Results - Sharpness (Padmanaban et al., PNAS 2017)
Figure: relative sharpness (blurry = -1, medium = 0, sharp = 1) vs. distance (1D/1 m, far, to 4D/0.25 m, near) for VR uncorrected, VR corrected, and normal correction; mean ratings 0.63 and 0.60.
46. Computational Near-eye Displays
• Q1: Can computational displays effectively replace glasses in VR/AR?
• Q2: How can the vergence-accommodation conflict be addressed for users of different ages?
• Q3: What are (in)effective near-eye display technologies?
Possible solutions: gaze-contingent focus, monovision, light field displays, …
48. Consequences of the Vergence-Accommodation Conflict
• Visual discomfort (eye tiredness & eyestrain) after ~20 minutes of stereoscopic depth judgments (Hoffman et al. 2008; Shibata et al. 2011)
• Degraded visual performance in terms of reaction times and acuity for stereoscopic vision (Hoffman et al. 2008; Konrad et al. 2016; Johnson et al. 2016)
57.-60. Do Presbyopes Benefit from Dynamic Focus? (Padmanaban et al., PNAS 2017)
Figure, built up over four slides: gain vs. age for conventional and dynamic displays, compared against the response for a physical stimulus (Heron & Charman 2004).
61.-63. Age-dependent Fusion (Padmanaban et al., PNAS 2017)
Figure, built up over three slides: percent fused vs. focus distance (far to near).
64.-66. Age-dependent Sharpness (Padmanaban et al., PNAS 2017)
Figure, built up over three slides: relative sharpness vs. focus distance (far to near).
67. • Q1: Can computational displays effectively replace glasses in VR/AR?
• Q2: How can the vergence-accommodation conflict be addressed for users of different ages?
• Q3: What are (in)effective near-eye display technologies?
Possible solutions: gaze-contingent focus, monovision, multiplane, light field displays, …
68. Gaze-contingent Focus
• non-presbyopes: adaptive focus is like the real world, but needs eye tracking!
Diagram: HMD with lens and microdisplay forming a virtual image, plus eye tracking.
Padmanaban et al., PNAS 2017
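The gaze-contingent loop sketched on this slide can be summarized in two steps: estimate the fixation depth from binocular eye tracking (vergence), then drive the varifocal element so the virtual image sits at that depth. The symmetric midline-fixation geometry below is a simplifying assumption, not the method of any particular system.

```python
import math

def vergence_depth_m(ipd_m: float, vergence_angle_rad: float) -> float:
    """Fixation distance from the interpupillary distance and the angle
    between the two gaze rays, assuming symmetric fixation on the
    midline (tan(theta/2) = (ipd/2) / depth)."""
    return (ipd_m / 2.0) / math.tan(vergence_angle_rad / 2.0)

def focus_demand_diopters(ipd_m: float, vergence_angle_rad: float) -> float:
    """Accommodation demand (diopters) the display should present so the
    virtual image matches the fixated depth."""
    return 1.0 / vergence_depth_m(ipd_m, vergence_angle_rad)
```

For a 64 mm IPD, fixating at 1 m corresponds to a vergence angle of about 3.7 degrees and a focus demand of 1 diopter; eye-tracking noise in the vergence angle maps directly into focus error, which is one reason the slide flags eye tracking as the hard requirement.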
74.-75. Monovision VR
Konrad et al., SIGCHI 2016; Johnson et al., Optics Express 2016; Padmanaban et al., PNAS 2017
• monovision did not drive accommodation more than conventional displays
• visually comfortable for most users, but particularly uncomfortable for some
76.-77. Multiplane VR Displays
• Rolland J, Krueger M, Goon A (2000) Multifocal planes head-mounted displays. Applied Optics 39
• Akeley K, Watt S, Girshick A, Banks M (2004) A stereo display prototype with multiple focal distances. ACM Trans. Graph. (SIGGRAPH)
• Waldkirch M, Lukowicz P, Tröster G (2004) Multiple imaging technique for extending depth of focus in retinal displays. Optics Express
• Schowengerdt B, Seibel E (2006) True 3-D scanned voxel displays using single or multiple light sources. JSID
• Liu S, Cheng D, Hua H (2008) An optical see-through head mounted display with addressable focal planes. Proc. ISMAR
• Love GD et al. (2009) High-speed switchable lens enables the development of a volumetric stereoscopic display. Optics Express
• … many more …
Timeline: idea introduced (Rolland et al. 2000) → benchtop prototype (Akeley 2004) → near-eye display prototypes (Liu 2008, Love 2009)
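Multiplane displays need a rule for distributing scene content across a small set of focal planes. A widely used approach (e.g. in the Akeley et al. 2004 line of work) is linear depth-weighted blending in diopter space; the sketch below shows that blending rule in its common linear form, not any single paper's exact method.

```python
import numpy as np

def blend_weights(point_d, planes_d):
    """Linear depth-weighted blending for a multiplane display.

    point_d: depth of a scene point in diopters.
    planes_d: plane depths in diopters, sorted ascending.
    Returns one nonnegative weight per plane, summing to 1: the point is
    split between the two planes bracketing its depth, linearly in
    dioptric distance.
    """
    w = np.zeros(len(planes_d))
    if point_d <= planes_d[0]:
        w[0] = 1.0                      # nearer-than-nearest: clamp
    elif point_d >= planes_d[-1]:
        w[-1] = 1.0                     # farther-than-farthest: clamp
    else:
        i = np.searchsorted(planes_d, point_d) - 1
        t = (point_d - planes_d[i]) / (planes_d[i + 1] - planes_d[i])
        w[i], w[i + 1] = 1.0 - t, t
    return w
```

Blending between two bracketing planes approximates intermediate focal depths, but each plane needs its own rendered image per frame, which is why the deck notes that multiplane designs demand very high-speed microdisplays.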
92. Diffraction in Multilayer Light Field Displays
Wetzstein et al., SIGGRAPH 2011; Lanman et al., SIGGRAPH Asia 2011; Wetzstein et al., SIGGRAPH 2012; Maimone et al., ACM Trans. Graph. 2013; …; Hirsch et al., SIGGRAPH 2014
Figure annotations: diffraction blur in multilayer prototypes; no diffraction artifacts with LCoS.
93.-94. Summary - Making Virtual Reality Better Than Reality?
• focus cues in VR/AR are challenging
• adaptive focus can correct for refractive errors (myopia, hyperopia)
• gaze-contingent focus gives natural focus cues for non-presbyopes, but requires eye tracking
• presbyopes require a fixed focal plane with correction (better than reality!)
• multiplane displays require very high-speed microdisplays
• monovision has not demonstrated significant improvements
• light field displays may be the "ultimate" display, but need to solve the "diffraction problem"
95. VR/AR = Frontier of Engineering
• Focus cues / visual accessibility
• Vestibular-visual conflict (motion sickness)
• AR: occlusions, registration of the physical/virtual world and eyes, consistent lighting, scanning the real world, VAC more important, display contrast & brightness
• Aesthetics / form factor, battery life, heat, wireless operation
• Low-power computer vision, fast embedded GPUs
• …
109. Conclusions
Advancing AR/VR technology requires a deep understanding of human vision, optics, signal processing, computation, and more.
Technology alone is not enough - engineer experiences!
111. Stanford Computational Imaging Lab
Light Field Displays • Time-of-Flight Imaging • Computational Microscopy • Image Optimization • Light Field Cameras • Near-eye Displays
112. Stanford Computational Imaging Lab
Open Lab this Friday (2/3) 10am-3pm at Stanford
Please email Helen Lin (helenlin@stanford.edu) for more details!
- "The Light Field Stereoscope" (demo), R. Konrad
- "Gaze-contingent and Varifocal Near-eye Displays" (demo), N. Padmanaban
- "Monovision Near-eye Displays" (demo), R. Konrad
- "Saliency in VR: How do People Explore Virtual Environments?" (poster and demo), V. Sitzmann
- "Accommodation-invariant Computational Near-eye Displays" (demo), R. Konrad
- "Depth-dependent Visual Anchoring for Reducing Motion Sickness in VR", N. Padmanaban
- "ProxImaL: Efficient Image Optimization using Proximal Algorithms" (poster), F. Heide
- "Dirty Pixels: Optimizing Image Classification Architectures for Raw Sensor Data" (poster), S. Diamond
- "Vortex: Live Cinematic Virtual Reality" (demo), R. Konrad
- "Transient Imaging with Single Photon Detectors" (poster), M. O'Toole
- "Robust Non-line-of-sight Imaging with Single Photon Detectors" (poster), F. Heide
- "Variable Aperture Light Field Photography" (poster), J. Chang
- "Computational Time-of-Flight Photography" (poster), F. Heide
- "Wide Field-of-View Monocentric Light Field Imaging" (poster and demo), D. Dansereau
- "Hacking the Vive Lighthouse - Arduino-based Positional Tracking in VR with Low-cost Components" (demo), K.
113. Acknowledgements
Near-eye Displays
• Robert Konrad (Stanford)
• Nitish Padmanaban (Stanford)
• Fu-Chung Huang (NVIDIA)
• Emily Cooper (Dartmouth College)
Saliency in VR
• Vincent Sitzmann (Stanford)
• Diego Gutierrez (U. Zaragoza)
• Ana Serrano (U. Zaragoza)
• Maneesh Agrawala (Stanford)
Spinning VR Camera
• Robert Konrad (Stanford)
• Donald Dansereau (Stanford)
Other
• Wolfgang Heidrich (UBC/KAUST)
• Ramesh Raskar (MIT/Facebook)
• Douglas Lanman (Oculus)
• Matt Hirsch (Lumii)
• Matthew O’Toole (Stanford)
• Felix Heide (Stanford)