Light Field Technology
Chen Zhao, Chen Zhi, John Allen Ray, Ung Guan Wah, Zhou Xintao
For information on other technologies, please see Jeff Funk’s slide share account
(http://www.slideshare.net/Funk98/presentations) or his book with Chris Magee:
Exponential Change: What drives it? What does it tell us about the future?
http://www.amazon.com/Exponential-Change-drives-about-future-
ebook/dp/B00HPSAYEM/ref=sr_1_1?ie=UTF8&qid=1398325920&sr=8-1&keywords=exponential+change
What is light field technology?
• Light passes through the main lens and then hits a special lens
• The special lens incorporates a compound lens known as a
micro-lens array (MLA)
• The MLA divides the image sensor's pixels (e.g. CCD, CMOS) into small groups, one per microlens
• Capture information on the direction, color, and luminosity of
millions of individual rays of light
• Computationally reconstruct the light field (3D)
• http://www.youtube.com/watch?v=7babcK2GH3I
[Figures: Lytro camera; generalized conceptual diagram of a light field camera]
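As a minimal sketch of the "computationally reconstruct" step above, the Python snippet below implements classic shift-and-sum refocusing on an assumed 4D light field array; the L[u, v, y, x] layout, the alpha parameter and the refocus helper are illustrative assumptions, not Lytro's actual pipeline.

import numpy as np

def refocus(L, alpha):
    # L is an assumed 4D light field array L[u, v, y, x]: angular indices
    # u, v over the aperture samples, spatial indices y, x over the image.
    # alpha sets the virtual focal plane; alpha = 0 keeps the captured focus.
    U, V, H, W = L.shape
    uc, vc = U // 2, V // 2
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each angular sample in proportion to its distance from
            # the aperture centre, then average all samples.
            dy = int(round(alpha * (u - uc)))
            dx = int(round(alpha * (v - vc)))
            out += np.roll(np.roll(L[u, v], dy, axis=0), dx, axis=1)
    return out / (U * V)

# Example with a random stand-in light field (9x9 views of a 64x64 image):
refocused = refocus(np.random.rand(9, 9, 64, 64), alpha=1.5)

Varying alpha moves the virtual focal plane after capture, which is the "shoot now, refocus later" behaviour described later in this deck.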
Driving Forces
• Refocusing pictures
• Optical correction
• 3D scanning and modeling
• Falling prices of image sensors and MLAs
Timeline of Light Field Cameras (LFC)
1992: Plenoptic camera proposed in research
2006: Refocus Imaging founded (later renamed Lytro)
2008: Raytrix founded
2010: Raytrix R11, the first commercialized LF camera, 30,000 euros, 3 megapixels
2011: Lytro launched, the first consumer LFC, 499 USD, 1.1 megapixels
2014 (Jan): Toshiba started sample shipments of a dual camera module with depth data, 50 USD, 13 megapixels
2015 (first half): Pelican Imaging shipments planned
Rates of Improvement (LFC)
[Chart: Production cost per megapixel (USD, log scale), 2009-2016, for the R11, Lytro, Toshiba and Pelican cameras]
[Chart: Total number of microlenses, 2009-2015, for the R11, Lytro and Toshiba cameras]
Year | Model   | Cost (USD) | Megapixels | Pixel size (µm) | Microlenses | Lens size (µm)
2010 | R11     | 39,000     | 3          | 9               | 40,000      | 200
2011 | Lytro   | 499        | 1          | 1.4             | 130,000     | 13.89
2014 | Toshiba | 50         | 13         | 1.4             | 500,000     | 30
?    | Pelican | 20         | 8          | ?               | ?           | ?
T. Suzuki, “Challenges of Image-Sensor Development”, ISSCC, 2010
http://www.future-fab.com/documents.asp?d_ID=4926
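As a quick sanity check of the cost-per-megapixel chart, the figures can be recomputed from the table above; a minimal sketch (the Pelican year is unknown, so it is labelled "planned"):

# Cost per megapixel recomputed from the table above (USD).
models = [("R11", 2010, 39_000, 3),
          ("Lytro", 2011, 499, 1),
          ("Toshiba", 2014, 50, 13),
          ("Pelican", "planned", 20, 8)]
for name, year, usd, mp in models:
    print(f"{name} ({year}): {usd / mp:,.1f} USD per megapixel")
# Roughly 13,000 -> 500 -> 4 -> 2.5 USD/MP: close to four orders of
# magnitude, matching the log-scale chart above.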
How Changes in Scale Have Affected the Cost per Pixel of Camera Chips
Microlens Array (MLA)
Modern fabrication methods:
• Photolithography based on
standard semiconductor
processing technology
• Feature sizes less than 1 mm, often as small as 10 µm
Other applications:
• Coupling light into optical fibres
• Increasing the light collection efficiency of CCD arrays
• Focusing light in digital projectors
• Concentrators for high-efficiency photovoltaics
Applications
• Cameras
• 3D Sensing
• Augmented Reality
• Telescope/Microscope
• Autonomous Vehicles
Mobile Software Refocus
2014 is becoming the year of mobile software refocus
Nokia Lumia https://refocus.nokia.com/
Sony Xperia™ Z2 (waterproof)
Samsung Galaxy S5
LG G Pro 2
Meizu MX3
Cameras - Reduction in size
Toshiba
• 18.0 mm x 12.0 mm x 4.65 mm
• Dual camera module
• 13 Megapixels
• CMOS Light Field sensor
• 500,000 microlenses (5X more
than Lytro)
• Module shipped in Jan 2014
• Sample price: US$50
Cameras - Reduction in size
Pelican Imaging
• Developed an extremely thin and cheap light field camera module
• Height: 3 mm
• Array of small cameras
• Production cost: US$20
• 8 megapixel output
• A future Nokia Lumia is confirmed to include this module
http://www.youtube.com/watch?v=Nleclfgqn_U
Applications
• Cameras
• 3D Sensing
• Augmented Reality
• Telescope/Microscope
• Autonomous Vehicles
Why light field is better for 3D sensing
• A light-field picture contains more information about depth than simple correspondence, which allows us to capture the real world in unparalleled detail (4D)
• Rendering complexity is independent of scene complexity
• Processing speed is fast
• No need to worry about the focus of the scanner lenses
3D Sensing Technologies
1. Stereoscopic vision
Currently the most common 3D sensing approach. Passive range determination via stereoscopic vision uses the disparity in viewpoints between a pair of near-identical cameras to measure the distance to a subject of interest.
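The depth calculation behind stereoscopic vision is a single triangulation formula, Z = f·B/d; a minimal sketch assuming a rectified pair of identical pinhole cameras (the numbers in the example are illustrative, not from the slides):

def stereo_depth(focal_px, baseline_m, disparity_px):
    # Classic rectified-pinhole relation: Z = f * B / d.
    # focal_px: focal length in pixels (identical cameras assumed)
    # baseline_m: separation between the two cameras in metres
    # disparity_px: horizontal shift of the same feature between the views
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, B = 0.1 m, d = 14 px  ->  Z = 5.0 m
print(stereo_depth(700, 0.1, 14))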
3D Sensing Technologies
2. Structured light
Replaces the second imaging sensor of the stereoscopic setup with a projection component. As in stereoscopic vision, this approach exploits the known camera-to-projector separation to locate a specific point and compute depth with triangulation algorithms.
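The same baseline triangulation works when a projector replaces the second camera; a minimal sketch under an assumed geometry in which both angles are measured from the camera-projector baseline toward the illuminated point:

import math

def structured_light_depth(baseline_m, cam_angle_rad, proj_angle_rad):
    # Triangulate the illuminated point from the camera-projector baseline.
    # Both angles are measured from the baseline toward the point.
    apex = math.pi - cam_angle_rad - proj_angle_rad      # angle at the point
    range_from_cam = baseline_m * math.sin(proj_angle_rad) / math.sin(apex)
    return range_from_cam * math.sin(cam_angle_rad)      # depth off the baseline

# e.g. 0.2 m baseline, 60 deg at the camera, 70 deg at the projector -> ~0.21 m
print(structured_light_depth(0.2, math.radians(60), math.radians(70)))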
3D Sensing Technologies
3. Time of flight (ToF)
An indirect method that obtains travel-time information by measuring the delay or phase shift of a modulated optical signal for all pixels in the scene. The ToF sensor consists of an array of pixels, each capable of determining the distance to the scene.
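For a continuous-wave ToF pixel, the measured phase shift maps directly to distance via d = c·φ / (4π·f_mod); a minimal sketch (the 20 MHz modulation frequency in the example is an illustrative assumption):

import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_rad, mod_freq_hz):
    # Continuous-wave ToF: d = c * phi / (4 * pi * f_mod).
    # The extra factor of two accounts for the round trip; the
    # unambiguous range is c / (2 * f_mod).
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# e.g. a 20 MHz modulation and a measured phase of pi/2 -> ~1.87 m
print(tof_distance(math.pi / 2, 20e6))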
Comparison (stereoscopic vision / structured light / time of flight (ToF) / light field):
• Software complexity: High / High / Low / Low
• Material cost: Low / High-Middle / Middle / Low
• Response time: Middle / Slow / Fast / Fast
• Low light: Weak / Depends on light source (IR or visible) / Good (IR, laser) / Good
• Outdoor: Good / Weak / Fair / Good
• Depth ("z") accuracy: cm / µm to cm / mm to cm / µm to cm
• Range: Mid range / Very short range (cm) to mid range (4-6 m) / Short range (<1 m) to long range (~40 m) / Very short range (cm) to long range (~100 m)
• Applications: device control, 3D movies, 3D scanning
3D Sensing Control
A 3D-vision gesture control system is a highly precise and reliable user interface for interacting with any display screen from any distance. Whether on a personal computer, set-top box, television, mobile device, game console, digital sign or interactive kiosk, the depth-tracking software enables users to control on-screen interaction with simple hand motions instead of a remote control, keyboard or touch screen.
3D Dental Scanner
3D scanning of the dental arch
Interactive Billboard
Interactive mirror
Future Retail Industry
Mechanical Parts Scanning
For Lockheed Martin, 3D scanner assures the right
fit the first time
Car scanning
Applications
• Cameras
• 3D Sensing
• Augmented Reality
• Telescope/Microscope
• Autonomous Vehicles
Is this real?
Created from light field AR
Created from a hologram
http://www.youtube.com/watch?feature=player_embedded&v=pky822zG4hM
• Without the benefit of clear natural sight, such light field advances in AR are extraordinarily helpful
• Ability to see more - inside of a patient (diagnosis and therapy)
• Limitation: these types of displays are cumbersome
• New equipment such as transparent screens
- Displaying information and graphics about the person's condition
- Combining visualization with location-tracking technology
• Medical AR technology compared with light field, ultrasound and location technology
AR Using Light Field: 3D Mould
• More details
• Bring it everywhere you can
AR Toolkit (VRML) vs. Light Field Rendering
Reference: https://www.academia.edu/5470506/AN_AUGMENTED_REALITY_SYSTEM_BASED_ON_LIGHT_FIELDS
AR Toolkit (VRML) vs. Light Field Rendering
Detail Processing
• Constant Response
• Shorter Response
Reference: https://www.academia.edu/5470506/AN_AUGMENTED_REALITY_SYSTEM_BASED_ON_LIGHT_FIELDS
Higher processing requirement
Traditional AR: nearing the peak of inflated expectations
Breaking away from the monitor and display
Level 0 - Physical World Hyperlinking
Level 1 - Marker-Based AR
Level 2 - Markerless AR
Level 3 - Augmented Vision
http://www.sprxmobile.com/the-augmented-reality-hype-cycle/
Distribution of AR Applications on Mobile
• Increasing demand for mobile device applications
• Install with a light field camera for fast response
• "Shoot and focus later"
http://www.augmentedplanet.com/2010/06/the-mobile-augmented-reality-competitive-landscape/
Representation: Hologram vs. Light Field
a) The representation of a hologram
b), c) Two different representations of a light field
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
Light Field Mapping
• The possibility of transforming a light field into a holographic representation and vice versa
• The holographic data representation is similar to a light field
• "M" transforms the light field into a holographic representation
• A method to extract depth from the input light field
• If accurate depth information is available for the light field, it can optionally be added to the input
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
Light Field Mapping
• Depth Reconstruction from Light Fields
• Effects of Loss of Data from Hologram
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
Hologram vs. Light Field
Reference: Eurographics 2007/D.Cohen-Or and P.Slavik, A Bidirectional Light Field-Hologram Transform Volume 26 (2007), Number 3
Hologram vs. Light Field
• Hologram: an illustration of direct output of holographic content on future-generation holographic displays
• Light field: far more efficient for conventional 2D frame-buffer displays
• Light field: versatility and the power of transformation on synthetic light fields, real light fields and digitally recorded holograms
• The rendered images can be evaluated directly from the holographic representation or through light field rendering
• A light field is capable of simulating different aperture sizes as well as focal lengths - versatile displays
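As a minimal sketch of the aperture-simulation point above, reusing the assumed L[u, v, y, x] layout from the refocusing example earlier in this deck, a smaller virtual aperture is obtained by averaging only the central angular samples:

import numpy as np

def synthetic_aperture(L, radius):
    # Average only the angular samples within `radius` of the aperture
    # centre, mimicking stopping down the lens after capture (assumed
    # layout L[u, v, y, x], as in the earlier refocusing sketch).
    U, V, H, W = L.shape
    uc, vc = U // 2, V // 2
    out = np.zeros((H, W))
    n = 0
    for u in range(U):
        for v in range(V):
            if (u - uc) ** 2 + (v - vc) ** 2 <= radius ** 2:
                out += L[u, v]
                n += 1
    return out / max(n, 1)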
Future
• Holograms can be captured using a light field camera
• Take advantage of the realism and detail-preserving benefits of a real light field while allowing 3D output on a holographic screen
AR Light Field Possible Improvements
• See-through displays
• New tracking sensors
• Interfaces and Interactions
Applications
• Cameras
• 3D Sensing
• Augmented Reality
• Telescope/Microscope
• Autonomous Vehicles
Scientific Applications
Light-Field Telescope
Light-Field Microscope
Light-Field Telescope
Source: Jonathan Wedd, Jan van der Laan, Eric Lavelle, and David Stoker, "A High-Magnification Light-Field Telescope for Extended Depth-of-Field Biometric Imaging"
Light-Field Telescope
• Capture the image faster and refocus it after capture
• Increase magnification
• Achieve large depth of field and high lateral image resolution simultaneously
• Capture different wavelengths
Light Field Microscope
• A compact light field microscope designed at Stanford
• Consists of an ordinary research microscope and a cooled scientific camera
• A microlens array is inserted into the optical path
Source: Marc Levoy, Ren Ng, Andrew Adams, Matthew Footer, Mark
Horowitz. “Light Field Microscopy”. Stanford University
Light Field Microscope
Source: Marc Levoy, Ren Ng, Andrew Adams, Matthew Footer, Mark
Horowitz. “Light Field Microscopy”. Stanford University
Light Field Microscope
• Captures light fields of biological specimens in a single snapshot
• Offers 3D functional imaging of neuronal activity in entire organisms at the single-cell level
• Separates image acquisition from the selection of viewpoint and focus
• Captures video of high-speed moving specimens
Applications
• Cameras
• 3D Sensing
• Augmented Reality
• Telescope/Microscope
• Autonomous Vehicles
Autonomous Vehicle Applications
Consumer:
• Passenger vehicles
• Agricultural vehicles
Military:
• Combat vehicles
• Logistics/supplies
• Search & rescue vehicles
M.Bellone, et al, “Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications”, 2012
http://cdn.intechopen.com/pdfs-wm/45459.pdf
LFC vs. Laser-Based Systems
Advantages:
• Easier data interpretation
• Significantly lower cost
• Lower power requirement
• Comparable performance
Disadvantages:
• Smaller field of view (multiple cameras required)
D. Stavens, “LEARNING TO DRIVE: PERCEPTION FOR AUTONOMOUS CARS”, 2011
http://purl.stanford.edu/pb661px9942
Light Field Depth Map
M.Tao, et al, “Depth from Combining Defocus and Correspondence Using Light-Field Cameras”, 2013
http://www.cs.berkeley.edu/~ravir/lightfield_ICCV.pdf
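A heavily simplified sketch of the correspondence cue used in this line of work (in the spirit of Tao et al., not their actual implementation): shear the sub-aperture views for each candidate disparity and keep, per pixel, the disparity at which the views agree best. The L[u, v, y, x] array layout is an assumption.

import numpy as np

def depth_from_lightfield(L, disparities):
    # L is an assumed 4D light field L[u, v, y, x]. For each candidate
    # disparity, shift every sub-aperture view toward the centre view and
    # measure the per-pixel variance across the views (a "correspondence
    # cue"); low variance means the views agree, so that disparity wins.
    U, V, H, W = L.shape
    uc, vc = U // 2, V // 2
    best_cost = np.full((H, W), np.inf)
    best_disp = np.zeros((H, W))
    for d in disparities:
        views = []
        for u in range(U):
            for v in range(V):
                dy = int(round(d * (u - uc)))
                dx = int(round(d * (v - vc)))
                views.append(np.roll(np.roll(L[u, v], dy, axis=0), dx, axis=1))
        cost = np.var(np.stack(views), axis=0)   # angular variance per pixel
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_disp[better] = d
    return best_disp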
Light Field Terrain Analysis
M.Bellone, et al, “Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications”, 2012
http://cdn.intechopen.com/pdfs-wm/45459.pdf
Light Field Terrain Analysis
M.Bellone, et al, “Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications”, 2012
http://cdn.intechopen.com/pdfs-wm/45459.pdf
Future of Light Field
We have analyzed the advantages of LF.
We have shown applications of LF in cameras, mobile phones, 3D scanning, AR, scientific research, and autonomous vehicles.
Future applications of LF will enable simpler and faster 3D applications.