SIGGRAPH 2012 Computational Plenoptic Imaging Course - 4 Light Fields


  1. Light Field Acquisition. Douglas Lanman, NVIDIA Research
  2. Introduction to Light Fields
  3-4. The Plenoptic Function P(θ, φ, λ, t). Q: What is the set of all things that one can ever see? A: The Plenoptic Function [Adelson and Bergen 1991] (from plenus, complete or full, and optic).
  5-6. The 7D Plenoptic Function. P(θ, φ, λ, t, p_x, p_y, p_z) defines the intensity of light:
       • as a function of viewpoint
       • as a function of time
       • as a function of wavelength
  7. The 5D Plenoptic Function. Ignoring color and time (i.e., treating them as attributes of rays) leaves P(θ, φ, p_x, p_y, p_z). The plenoptic function is 5D:
       • 3D position
       • 2D direction
     Five dimensions are required to represent ray attributes across occlusions.
  8. The 4D Light Field. In a region free of occluding objects, the plenoptic function P(θ, φ, p_x, p_y, p_z) reduces to a 4D light field:
       • 2D position
       • 2D direction
     The space of all lines in 3D space is 4D. [Levoy and Hanrahan 1996; Gortler et al. 1996] A worked dimension count follows below.
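To summarize the dimension counting of slides 3-8 in one chain (the two-plane coordinates (u, v, s, t) are introduced on the next slides):

```latex
\underbrace{P(\theta,\phi,\lambda,t,\,p_x,p_y,p_z)}_{\text{7D plenoptic function}}
\;\xrightarrow{\ \text{fix }\lambda,\,t\ }\;
\underbrace{P(\theta,\phi,\,p_x,p_y,p_z)}_{\text{5D, valid across occlusions}}
\;\xrightarrow{\ \text{occluder-free region}\ }\;
\underbrace{L(u,v,s,t)}_{\text{4D light field}}
```

The final step works because radiance is constant along a ray in free space, so the coordinate along each ray carries no information and one dimension drops out.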
  9. Position-Angle Parameterization: 2D position on a reference plane (s) plus 2D direction (θ). [Figure: a ray leaving plane s at angle θ]
  10. Two-Plane Parameterization: 2D position on one plane (u) plus 2D position on a second plane (s). [Figure: a ray intersecting planes u and s] A code sketch of this parameterization follows below.
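A minimal sketch of the two-plane parameterization in code; the plane separation `d` and the axis conventions are illustrative assumptions, not from the slides:

```python
import numpy as np

def ray_from_two_planes(u, v, s, t, d=1.0):
    """Ray through (u, v) on the first plane (z = 0) and (s, t) on the
    second plane (z = d); returns its origin and unit direction."""
    origin = np.array([u, v, 0.0])
    direction = np.array([s - u, t - v, d])
    return origin, direction / np.linalg.norm(direction)

# Example: the ray through u-plane point (0, 0) and s-plane point (0.5, 0).
origin, direction = ray_from_two_planes(0.0, 0.0, 0.5, 0.0)
print(origin, direction)
```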
  11. Alternative Parameterizations. Left: points on a plane or curved surface and the directions leaving each point. Center: pairs of points on the surface of a sphere. Right: pairs of points on two different (typically parallel) planes.
  12. Image-Based Rendering [Levoy and Hanrahan 1996; Gortler et al. 1996; Carranza et al. 2003]. A resampling sketch follows below.
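Rendering a novel view from a light field can be sketched as resampling the 4D table: each viewing ray maps to a continuous (u, v, s, t) coordinate, which is interpolated quadrilinearly, in the spirit of [Levoy and Hanrahan 1996]. The L[u, v, s, t] array layout is an assumption for illustration:

```python
import numpy as np
from itertools import product

def sample_light_field(L, u, v, s, t):
    """Quadrilinear interpolation of L at continuous (u, v, s, t)."""
    base = np.floor([u, v, s, t]).astype(int)
    frac = np.array([u, v, s, t]) - base
    value = 0.0
    for corner in product((0, 1), repeat=4):
        # Clamp each of the 16 neighboring samples to the table bounds.
        idx = np.minimum(np.maximum(base + corner, 0),
                         np.array(L.shape[:4]) - 1)
        # Multilinear weight: frac toward the upper corner, 1 - frac toward lower.
        weight = np.prod(np.where(corner, frac, 1.0 - frac))
        value += weight * L[tuple(idx)]
    return value

# Example: an 8x8 grid of 64x64-pixel views, sampled off-grid.
L = np.random.rand(8, 8, 64, 64)
print(sample_light_field(L, 3.2, 4.7, 10.5, 20.25))
```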
  13. Digital Image Refocusing: Kodak 16-megapixel sensor, 125 μm square-sided microlenses [Ng 2005]. A shift-and-add sketch follows below.
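Digital refocusing can be sketched as shift-and-add over sub-aperture images, in the spirit of [Ng 2005]: each view is translated in proportion to its offset from the aperture center, then all views are averaged. The L[u, v, y, x] layout and the depth parameter `alpha` are assumptions for illustration:

```python
import numpy as np

def refocus(L, alpha):
    """Synthetically refocus a decoded light field of shape (nu, nv, h, w)."""
    nu, nv, h, w = L.shape
    out = np.zeros((h, w))
    for u in range(nu):
        for v in range(nv):
            # Shift proportional to the view's offset from the aperture center.
            du = alpha * (u - (nu - 1) / 2.0)
            dv = alpha * (v - (nv - 1) / 2.0)
            # Integer-pixel shift for brevity; real implementations interpolate.
            out += np.roll(L[u, v], (int(round(du)), int(round(dv))),
                           axis=(0, 1))
    return out / (nu * nv)

# Example: refocus an 8x8-view light field at two hypothetical depths.
L = np.random.rand(8, 8, 64, 64)
near, far = refocus(L, -1.5), refocus(L, +1.5)
```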
  14. 3D Displays: Parallax Panoramagram [Kanolt 1918]; 3DTV with Integral Imaging [Okano et al. 1999]; MERL 3DTV [Matusik and Pfister 2004]
  15. Multiple Sensors
  16. Static Camera Arrays: the Stanford Multi-Camera Array, 125 cameras using custom hardware [Wilburn et al. 2002; Wilburn et al. 2005], and the Distributed Light Field Camera, 64 cameras with distributed rendering [Yang et al. 2002]
  17. Temporal Multiplexing
  18. Controlled Camera or Object Motion: the Stanford Spherical Gantry [Levoy and Hanrahan 1996] and Relighting with 4D Incident Light Fields [Masselus et al. 2003]
  19. Uncontrolled Camera or Object Motion: Unstructured Lumigraph Rendering [Gortler et al. 1996; Buehler et al. 2001]
  20. Spatial Multiplexing
  21. Parallax Barriers (Pinhole Arrays). [Figure: pinhole barrier mounted in front of the sensor] Spatially multiplexed light field capture using masks (i.e., barriers):
       • causes severe attenuation → long exposures or low SNR
       • imposes a fixed trade-off between spatial and angular resolution (unless implemented with programmable masks, e.g., LCDs)
     [Ives 1903] A decoding sketch follows below.
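Decoding such a spatially multiplexed photograph can be sketched as a reshape: each pinhole forms a macropixel whose nu x nv pixels are angular samples of the same scene point. The assumption that pinholes align exactly with pixel blocks is an idealization for illustration:

```python
import numpy as np

def decode_macropixels(sensor, nu, nv):
    """Unpack a tiled macropixel image into a 4D light field L[u, v, s, t]."""
    h, w = sensor.shape
    ns, nt = h // nu, w // nv          # spatial resolution = pinhole grid size
    # Row index decomposes as s*nu + u, column index as t*nv + v.
    L = sensor[:ns * nu, :nt * nv].reshape(ns, nu, nt, nv)
    return L.transpose(1, 3, 0, 2)     # reorder axes to (u, v, s, t)

# Example: a 512x512 sensor behind a 64x64 pinhole array (8x8 angles each).
sensor = np.random.rand(512, 512)
L = decode_macropixels(sensor, 8, 8)
print(L.shape)  # (8, 8, 64, 64)
```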
  22. Light Field Photograph (Sensor)
  23. Light Field Photograph (Decoded). [Figure: sub-aperture views labeled "looking up" and "looking to the right"; sample image from The (New) Stanford Light Field Archive]
  24. Integral Imaging ("Fly's Eye" Lenslets). [Figure: lenslet array at focal distance f in front of the sensor] Spatially multiplexed light field capture using lenslets:
       • imposes a fixed trade-off between spatial and angular resolution
     [Lippmann 1908] A quick resolution calculation follows below.
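The fixed trade-off can be made concrete with back-of-the-envelope numbers; the sensor and lenslet counts here are hypothetical, not taken from the slides:

```python
# Total sensor pixels = spatial samples x angular samples, so increasing
# angular resolution directly reduces spatial resolution.
sensor_px = (4000, 4000)        # hypothetical 16-megapixel sensor
pixels_per_lenslet = (10, 10)   # hypothetical angular resolution per lenslet

spatial = (sensor_px[0] // pixels_per_lenslet[0],
           sensor_px[1] // pixels_per_lenslet[1])
print(f"{spatial[0]}x{spatial[1]} spatial samples, "
      f"{pixels_per_lenslet[0]}x{pixels_per_lenslet[1]} angular samples")
# -> 400x400 spatial samples, 10x10 angular samples
```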
  25. Modern, Digital Implementations. Digital light field photography:
       • hand-held plenoptic camera [Ng et al. 2005]
       • heterodyne light field camera [Veeraraghavan et al. 2007]
     A frequency-domain decoding sketch follows below.
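For the heterodyne design, a rough sketch of frequency-domain demultiplexing in the spirit of [Veeraraghavan et al. 2007]: a cosine mask near the sensor replicates the light field spectrum into a grid of tiles in the photograph's 2D Fourier transform, and cropping each tile and inverse transforming recovers angular views. The tile counts and the exact cropping are illustrative assumptions:

```python
import numpy as np

def heterodyne_decode(photo, nu, nv):
    """Recover nu x nv angular views from the spectral tiles of a
    mask-modulated photograph (idealized tiling assumed)."""
    F = np.fft.fftshift(np.fft.fft2(photo))
    h, w = F.shape
    th, tw = h // nu, w // nv          # size of each spectral tile
    views = np.empty((nu, nv, th, tw), dtype=complex)
    for i in range(nu):
        for j in range(nv):
            tile = F[i * th:(i + 1) * th, j * tw:(j + 1) * tw]
            views[i, j] = np.fft.ifft2(np.fft.ifftshift(tile))
    return views.real

# Example: decode a hypothetical 512x512 heterodyne photograph into 3x3 views.
photo = np.random.rand(512, 512)
views = heterodyne_decode(photo, 3, 3)
print(views.shape)  # (3, 3, 170, 170)
```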
