SIGGRAPH 2012 Computational Plenoptic Imaging Course - 4 Light Fields


- 1. Light Field Acquisition. Douglas Lanman, NVIDIA Research.
- 2. Introduction to Light Fields
- 3. The Plenoptic Function. Q: What is the set of all things that one can ever see? A: The plenoptic function [Adelson and Bergen 1991] (from plenus, complete or full, and optic): P(θ, φ, λ, t)
- 5. The 7D Plenoptic Function. P(θ, φ, λ, t, px, py, pz) defines the intensity of light as a function of viewpoint, as a function of time, and as a function of wavelength.
- 7. The 5D Plenoptic Function. Ignoring color and time (i.e., treating these as attributes of rays) gives P(θ, φ, px, py, pz). The plenoptic function is then 5D: 3D position and 2D direction. All 5 dimensions are required to represent attributes across occlusions.
- 8. The 4D Light Field. Consider a region free of occluding objects: P(θ, φ, px, py, pz) reduces to a 4D function, the light field, with 2D position and 2D direction. The space of all lines in 3D space is 4D. [Levoy and Hanrahan 1996; Gortler et al. 1996]
- 9. Position-Angle Parameterization: 2D position (s) and 2D direction (θ).
- 10. Two-Plane Parameterization: 2D position (u) on one plane and 2D position (s) on a second plane.
- 11. Alternative Parameterizations. Left: points on a plane or curved surface and directions leaving each point. Center: pairs of points on the surface of a sphere. Right: pairs of points on two different (typically parallel) planes.
- 12. Image-Based Rendering [Levoy and Hanrahan 1996] [Gortler et al. 1996] [Carranza et al. 2003]
- 13. Digital Image Refocusing. Kodak 16-megapixel sensor with 125 μm square-sided microlenses. [Ng 2005]
- 14. 3D Displays: Parallax Panoramagram [Kanolt 1918]; 3DTV with Integral Imaging [Okano et al. 1999]; MERL 3DTV [Matusik and Pfister 2004].
- 15. Multiple Sensors
- 16. Static Camera Arrays. Stanford Multi-Camera Array: 125 cameras using custom hardware [Wilburn et al. 2002; Wilburn et al. 2005]. Distributed Light Field Camera: 64 cameras with distributed rendering [Yang et al. 2002].
- 17. Temporal Multiplexing
- 18. Controlled Camera or Object Motion. Stanford Spherical Gantry [Levoy and Hanrahan 1996]; Relighting with 4D Incident Light Fields [Masselus et al. 2003].
- 19. Uncontrolled Camera or Object Motion. Unstructured Lumigraph Rendering [Gortler et al. 1996; Buehler et al. 2001].
- 20. Spatial Multiplexing
- 21. Parallax Barriers (Pinhole Arrays): a barrier mask placed over the sensor. Spatially-multiplexed light field capture using masks (i.e., barriers): causes severe attenuation (long exposures or lower SNR) and imposes a fixed trade-off between spatial and angular resolution (unless implemented with programmable masks, e.g., LCDs). [Ives 1903]
- 22. Light Field Photograph (Sensor)
- 23. Light Field Photograph (Decoded): looking up; looking to the right. Sample image from The (New) Stanford Light Field Archive.
- 24. Integral Imaging (“Fly’s Eye” Lenslets): a lenslet array placed at its focal distance f above the sensor. Spatially-multiplexed light field capture using lenslets: imposes a fixed trade-off between spatial and angular resolution. [Lippmann 1908]
- 25. Modern, Digital Implementations. Digital light field photography: hand-held plenoptic camera [Ng et al. 2005]; heterodyne light field camera [Veeraraghavan et al. 2007].
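The 4D light field of slide 8 is easy to make concrete in code. A minimal NumPy sketch (sample counts are illustrative, not from the slides), treating the light field as a 4D array indexed by two angular and two spatial coordinates:

```python
import numpy as np

# A 4D light field on the two-plane parameterization:
# L[u, v, s, t] = radiance of the ray through (u, v) on one plane
# and (s, t) on the other. Sample counts below are illustrative.
n_u, n_v = 8, 8      # angular samples (e.g., camera positions)
n_s, n_t = 64, 64    # spatial samples (pixels per view)

light_field = np.zeros((n_u, n_v, n_s, n_t), dtype=np.float32)

# A single perspective view is a 2D slice at fixed (u, v):
center_view = light_field[n_u // 2, n_v // 2]    # shape (64, 64)

# An epipolar-plane image fixes one angular and one spatial axis:
epi = light_field[:, n_v // 2, :, n_t // 2]      # shape (8, 64)
```

Slicing along different axes recovers the familiar renderings: perspective views, epipolar-plane images, and synthetic apertures are all sub-arrays of the same 4D volume.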
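The two-plane parameterization of slide 10 amounts to recording where a ray intersects each of two parallel planes. A minimal sketch; the function name and the plane placements z = 0 and z = 1 are illustrative choices, not from the slides:

```python
import numpy as np

def ray_to_two_plane(origin, direction, z_u=0.0, z_s=1.0):
    """Intersect a 3D ray with the planes z = z_u and z = z_s and
    return the 4D two-plane coordinates (u, v, s, t).

    Assumes the ray is not parallel to the planes (direction[2] != 0).
    """
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    k_u = (z_u - o[2]) / d[2]          # ray parameter at the first plane
    k_s = (z_s - o[2]) / d[2]          # ray parameter at the second plane
    u, v = o[:2] + k_u * d[:2]
    s, t = o[:2] + k_s * d[:2]
    return float(u), float(v), float(s), float(t)

# A ray leaving the origin at 45 degrees in the x-z plane:
print(ray_to_two_plane([0, 0, 0], [1, 0, 1]))  # (0.0, 0.0, 1.0, 0.0)
```

The inverse map is just as direct: (u, v) and (s, t) fix two points, and the line through them recovers the ray, which is why four numbers suffice in free space.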
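Digital refocusing as on slide 13 can be approximated by shift-and-add synthetic-aperture imaging: shift each view in proportion to its offset from the central view, then average. A simplified sketch of that idea; the integer pixel shifts and `np.roll` wrap-around are shortcuts for brevity, whereas Ng's method resamples with interpolation:

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-add refocusing over an (n_u, n_v, h, w) stack of views.

    alpha selects the refocus plane: each view is shifted in proportion
    to its offset from the central view, then all views are averaged.
    Integer shifts with wrap-around (np.roll) keep the sketch short; a
    faithful implementation would interpolate and pad at the borders.
    """
    n_u, n_v, h, w = light_field.shape
    cu, cv = (n_u - 1) / 2.0, (n_v - 1) / 2.0
    out = np.zeros((h, w), dtype=np.float64)
    for u in range(n_u):
        for v in range(n_v):
            du = int(round(alpha * (u - cu)))
            dv = int(round(alpha * (v - cv)))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (n_u * n_v)

# alpha = 0 reproduces a plain average of the views (focus at infinity
# for parallel views); sweeping alpha sweeps the focal plane.
```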
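Decoding a spatially multiplexed photograph (slides 21-23) is a pixel rearrangement: each pinhole's macropixel on the sensor holds the angular samples for one spatial location. A sketch under the idealized assumption that macropixels are non-overlapping (n_u, n_v) blocks tiling the sensor exactly:

```python
import numpy as np

def decode_mask_image(sensor, n_u, n_v):
    """Rearrange a spatially multiplexed sensor image into a 4D light
    field of shape (n_u, n_v, n_s, n_t).

    Idealized assumption: each pinhole projects an (n_u, n_v) block of
    angular samples onto the sensor, and the blocks tile exactly with
    no overlap, vignetting, or rotation.
    """
    h, w = sensor.shape
    n_s, n_t = h // n_u, w // n_v
    macropixels = sensor.reshape(n_s, n_u, n_t, n_v)  # split into blocks
    return macropixels.transpose(1, 3, 0, 2)          # -> (u, v, s, t)
```

Real decoders must first calibrate the pinhole (or lenslet) grid against the pixel grid; the reshape above is the clean-data core of that pipeline.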
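The fixed spatial-angular trade-off mentioned on slides 21 and 24 is simple arithmetic: pixels spent on angle come directly out of spatial resolution. Illustrative numbers loosely modeled on the 16-megapixel camera of slide 13; the exact sensor side and per-lenslet pixel count here are assumptions:

```python
# Fixed spatial-angular trade-off behind a lenslet (or pinhole) array:
# the sensor's pixels are divided between spatial and angular samples.
sensor_side = 4096        # pixels per side (~16-megapixel sensor, assumed)
angular_side = 14         # pixels devoted to angle behind each lenslet
lenslets_per_side = sensor_side // angular_side

print(lenslets_per_side)        # lenslets across the sensor
print(lenslets_per_side ** 2)   # spatial samples in each refocused output
```

With these assumed numbers, roughly 16.8 million sensor pixels yield only about 85 thousand output pixels, which is the cost of capturing 14 × 14 angular samples at each spatial location.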
