Cameras: How They Work

  1. 1. Cameras. Digital Visual Effects, Spring 2005. Yung-Yu Chuang, 2005/3/2. With slides by Brian Curless, Steve Seitz and Alexei Efros.
  2. 2. Announcements • The classroom has changed to Room 101 • Assignment schedule – Image morphing (3/9-3/30) – Image stitching (3/30-4/20) – Matchmove (4/20-5/11) – Final project (5/11-6/22) • Scribe • Send mail to cyy@csie.ntu.edu.tw to subscribe to vfx
  3. 3. Outline • Pinhole camera • Film camera • Digital camera • Video camera • High dynamic range imaging
  4. 4. Camera trial #1: put a piece of film in front of an object. (Diagram: scene, film)
  5. 5. Pinhole camera: add a barrier to block off most of the rays. • It reduces blurring • The pinhole is known as the aperture • The image is inverted (Diagram: scene, barrier with pinhole, film)
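The pinhole geometry reduces to similar triangles: a scene point (X, Y, Z) lands at (-fX/Z, -fY/Z) on film placed a distance f behind the pinhole. A minimal sketch; the function name and the 50 mm pinhole-to-film distance are illustrative assumptions:

```python
import numpy as np

def project_pinhole(point_3d, f=0.05):
    """Project a 3D scene point (X, Y, Z), in meters, through a pinhole onto
    a film plane a distance f behind it. The image is inverted, hence the
    minus signs (similar triangles: x / f = -X / Z)."""
    X, Y, Z = point_3d
    return np.array([-f * X / Z, -f * Y / Z])

# A point 2 m in front of the pinhole, 0.5 m right and 0.3 m up, lands
# below and to the left of the optical axis on the film.
print(project_pinhole((0.5, 0.3, 2.0)))   # ~ [-0.0125, -0.0075]
```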
  6. 6. Shrinking the aperture: why not make the aperture as small as possible? • Less light gets through • Diffraction effects blur the image
  7. 7. Shrinking the aperture
  8. 8. Camera obscura: drawing from “The Great Art of Light and Shadow” by the Jesuit Athanasius Kircher, 1646.
  9. 9. High-end commercial pinhole cameras http://www.bobrigby.com/html/pinhole.html
  10. 10. Adding a lens
  11. 11. Adding a lens: a lens focuses light onto the film • There is a specific distance at which objects are “in focus” • Other points project to a “circle of confusion” in the image (Diagram: scene, lens, film, circle of confusion)
  12. 12. Lenses. Thin lens equation: 1/d_o + 1/d_i = 1/f, where d_o is the object distance, d_i the image distance, and f the focal length. • Any object point satisfying this equation is in focus • Thin lens applet: http://www.phy.ntnu.edu.tw/java/Lens/lens_e.html
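A small numeric sketch of the thin lens equation and the resulting circle of confusion for out-of-focus points; the helper names and the 50 mm f/2 example values are assumptions for illustration, not from the slides:

```python
def image_distance(d_o, f):
    """Thin lens equation 1/d_o + 1/d_i = 1/f, solved for the image distance d_i."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

def circle_of_confusion(d_o, d_focus, f, D):
    """Blur circle diameter on the sensor for a point at distance d_o when the
    lens (focal length f, aperture diameter D) is focused at distance d_focus.
    Follows from similar triangles on the converging cone of rays."""
    v = image_distance(d_o, f)             # where this point actually focuses
    v_sensor = image_distance(d_focus, f)  # where the sensor sits
    return D * abs(v - v_sensor) / v

# 50 mm lens at f/2 (25 mm aperture), focused at 2 m:
f, D = 0.050, 0.025
print(circle_of_confusion(d_o=3.0, d_focus=2.0, f=f, D=D))  # small blur circle (m)
print(circle_of_confusion(d_o=2.0, d_focus=2.0, f=f, D=D))  # 0.0, exactly in focus
```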
  13. 13. Exposure = aperture + shutter speed • An aperture of diameter D restricts the range of rays (the aperture may be on either side of the lens) • Shutter speed controls how long light is allowed to pass through the aperture
  14. 14. Aperture • Aperture is usually specified by an f-stop (f-number), the focal length divided by the aperture diameter, f/D. Each full f-stop step either doubles the light or cuts it in half. • Lower f-stop: more light (larger lens opening) • Higher f-stop: less light (smaller lens opening)
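To make the stop arithmetic concrete: light reaching the sensor scales with the shutter time divided by the square of the f-number, so one stop of aperture trades off against a doubling of the shutter time. A sketch with hypothetical helper names:

```python
import math

def relative_exposure(f_number, shutter_time):
    """Light gathered scales with aperture area (proportional to 1/f_number^2)
    times the time the shutter is open."""
    return shutter_time / (f_number ** 2)

def exposure_value(f_number, shutter_time):
    """Standard exposure value: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_time)

# Stopping down one full stop (f/4 -> f/5.6) halves the light;
# doubling the shutter time (1/125 -> 1/60) roughly compensates.
print(relative_exposure(4.0, 1 / 125), relative_exposure(5.6, 1 / 60))
print(exposure_value(4.0, 1 / 125), exposure_value(5.6, 1 / 60))
```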
  15. 15. Depth of field: changing the aperture size affects depth of field. A smaller aperture increases the range in which the object is approximately in focus. See http://www.photonhead.com/simcam/
  16. 16. Distortion • Radial distortion of the image – Caused by imperfect lenses – Deviations are most noticeable for rays that pass through the edge of the lens (Examples: no distortion, pincushion, barrel)
  17. 17. Correcting radial distortion (example from Helmut Dersch)
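The slides do not give Dersch's exact correction, but a common polynomial radial model, and its inversion by fixed-point iteration, looks roughly like this; the coefficients k1, k2 and the function names are illustrative:

```python
def apply_radial_distortion(x, y, k1, k2):
    """Polynomial radial distortion in normalized image coordinates with the
    distortion center at the origin: barrel for k < 0, pincushion for k > 0."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort_point(xd, yd, k1, k2, iters=10):
    """Invert the model by fixed-point iteration: repeatedly divide the
    distorted point by the distortion factor evaluated at the current estimate."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y

# Distort a point near the image corner, then recover it.
xd, yd = apply_radial_distortion(0.8, 0.6, k1=-0.15, k2=0.02)
print(undistort_point(xd, yd, k1=-0.15, k2=0.02))  # ~ (0.8, 0.6)
```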
  18. 18. Film camera (Diagram: scene, lens & motor, aperture & shutter, film)
  19. 19. Digital camera • A digital camera replaces film with a sensor array • Each cell in the array is a light-sensitive diode that converts photons to electrons (Diagram: scene, lens & motor, aperture & shutter, sensor array)
  20. 20. CCD vs. CMOS • CCD is less susceptible to noise (special fabrication process, higher fill factor) • CMOS is more flexible, less expensive (standard process), and consumes less power
  21. 21. Sensor noise • Blooming • Diffusion • Dark current • Photon shot noise • Amplifier readout noise
  22. 22. Color So far, we’ve only talked about monochrome sensors. Color imaging has been implemented in a number of ways: • Field sequential • Multi-chip • Color filter array • X3 sensor
  23. 23. Field sequential
  24. 24. Field sequential
  25. 25. Field sequential
  26. 26. Prokudin-Gorskii (early 1900s). Lantern projector. http://www.loc.gov/exhibits/empire/
  27. 27. Prokudin-Gorskii (early 1900s)
  28. 28. Multi-chip (wavelength-dependent splitting onto separate sensors)
  29. 29. Embedded color filters Color filters can be manufactured directly onto the photodetectors.
  30. 30. Color filter array Color filter arrays (CFAs)/color filter mosaics Kodak DCS620x
  31. 31. Color filter array Color filter arrays (CFAs)/color filter mosaics Bayer pattern
  32. 32. Bayer’s pattern
  33. 33. Demosaicking CFAs: bilinear interpolation (Example images: original, input mosaic, linear interpolation)
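A rough sketch of bilinear demosaicking for an RGGB Bayer mosaic, assuming NumPy/SciPy; the mosaic layout, averaging kernel, and function name are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaicking of an RGGB Bayer mosaic (HxW float array).

    Each channel is treated as a sparse image (zeros where that channel was
    not sampled) and filled in by a normalized neighborhood average."""
    h, w = raw.shape
    rows, cols = np.mgrid[0:h, 0:w]
    masks = {
        "R": (rows % 2 == 0) & (cols % 2 == 0),
        "G": (rows % 2) != (cols % 2),
        "B": (rows % 2 == 1) & (cols % 2 == 1),
    }
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    out = np.zeros((h, w, 3))
    for i, ch in enumerate("RGB"):
        sparse = raw * masks[ch]
        # Divide smoothed samples by the smoothed mask so every pixel gets a
        # properly normalized average of the nearby samples of this channel.
        num = convolve(sparse, kernel, mode="mirror")
        den = convolve(masks[ch].astype(float), kernel, mode="mirror")
        out[..., i] = num / den
    return out

# Tiny sanity check: a flat gray scene should demosaic back to flat gray.
mosaic = np.full((6, 8), 0.5)
print(np.allclose(demosaic_bilinear(mosaic), 0.5))  # True
```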
  34. 34. Demosaicking CFAs: constant hue-based interpolation (Cok). Interpolate G first, then interpolate the hue (the R/G and B/G ratios) and rebuild R and B from it.
  35. 35. Demosaicking CFAs: median-based interpolation (Freeman) 1. Linear interpolation 2. Median filter on color differences
  36. 36. Demosaicking CFAs: median-based interpolation (Freeman) (Example images: original, input, linear interpolation, color difference, median filter, reconstruction)
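A simplified sketch of the two Freeman steps, reusing the demosaic_bilinear sketch above; the full method also restores the measured samples at each site, which is omitted here, and the window size is an assumption:

```python
import numpy as np
from scipy.ndimage import median_filter

def demosaic_freeman(raw, window=5):
    """Median-based demosaicking in the spirit of the two steps on the slide:
    (1) bilinear interpolation, (2) median filter on the color differences."""
    rgb = demosaic_bilinear(raw)        # sketch defined earlier
    g = rgb[..., 1]
    # Color differences vary slowly across most natural images, so a median
    # filter removes the zipper-like fringes bilinear interpolation leaves at edges.
    r_minus_g = median_filter(rgb[..., 0] - g, size=window)
    b_minus_g = median_filter(rgb[..., 2] - g, size=window)
    out = rgb.copy()
    out[..., 0] = g + r_minus_g
    out[..., 2] = g + b_minus_g
    return out
```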
  37. 37. Demosaicking CFAs: gradient-based interpolation (LaRoche-Prescott) 1. Interpolation on G (the interpolation direction is chosen from local horizontal and vertical gradients)
  38. 38. Demosaicking CFAs: gradient-based interpolation (LaRoche-Prescott) 2. Interpolation of color differences
  39. 39. Demosaicking CFAs: comparison (bilinear, Cok, Freeman, LaRoche)
  40. 40. Demosaicking CFAs: generally, Freeman's is the best, especially for natural images.
  41. 41. Foveon X3 sensor • Light penetrates to different depths for different wavelengths • A multilayer CMOS sensor gets 3 different spectral sensitivities
  42. 42. Foveon X3 sensor (comparison: Bayer CFA vs. X3 sensor)
  43. 43. Color processing • After color values are recorded, more color processing usually happens: – White balance – Non-linearity to approximate film response or match TV monitor gamma
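A minimal sketch of the two processing steps just listed: channel gains derived from a neutral reference patch, then a display gamma. The function names, the green-channel normalization, and the patch coordinates in the usage comment are assumptions:

```python
import numpy as np

def white_balance(img, neutral_patch):
    """Scale each channel so a region known to be neutral (white or gray)
    comes out with equal R, G, B. img is HxWx3 with linear values in [0, 1];
    neutral_patch is a slice of img covering the reference region."""
    means = neutral_patch.reshape(-1, 3).mean(axis=0)
    gains = means[1] / means            # normalize gains to the green channel
    return np.clip(img * gains, 0.0, 1.0)

def apply_gamma(img, gamma=2.2):
    """Simple non-linearity approximating a film/TV display gamma."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

# Usage sketch (the white reference patch coordinates are hypothetical):
# balanced = white_balance(img, img[90:110, 190:210])
# display_ready = apply_gamma(balanced)
```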
  44. 44. White balance (examples: automatic white balance, warmer +3)
  45. 45. Manual white balance (examples: white balance set with the white book vs. with the red book)
  46. 46. Autofocus • Active – Sonar – Infrared • Passive
  47. 47. Digital camera review website • http://www.dpreview.com/
  48. 48. Camcorder
  49. 49. Interlacing (comparison: with interlacing vs. without interlacing)
  50. 50. Deinterlacing: blend, weave
  51. 51. Deinterlacing: discard; progressive scan
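A sketch of the three deinterlacing strategies named on the last two slides (weave, blend, discard) applied to a pair of fields; real deinterlacers vary in the details, so treat the function names and the blending choice as illustrative:

```python
import numpy as np

def deinterlace_weave(field_even, field_odd):
    """Weave: interleave the two fields back into one full-height frame.
    Perfect for static content, combing artifacts where there is motion."""
    h, w = field_even.shape
    frame = np.empty((2 * h, w), dtype=field_even.dtype)
    frame[0::2] = field_even
    frame[1::2] = field_odd
    return frame

def deinterlace_blend(field_even, field_odd):
    """Blend: line-double each field and average them, trading weave's
    combing for ghosting/blur wherever the scene moved between fields."""
    a = np.repeat(field_even, 2, axis=0).astype(float)
    b = np.repeat(field_odd, 2, axis=0).astype(float)
    return 0.5 * (a + b)

def deinterlace_discard(field_even):
    """Discard: keep one field and line-double it, giving progressive frames
    at half the vertical resolution."""
    return np.repeat(field_even, 2, axis=0)
```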
  52. 52. Hard cases
  53. 53. High dynamic range imaging
  54. 54. Camera pipeline
  55. 55. High dynamic range image
  56. 56. Short exposure: real-world radiance spans roughly 10^-6 to 10^6; a short exposure maps only the bright part of that range onto the picture's dynamic range (pixel values 0 to 255), so dark regions are lost.
  57. 57. Long exposure: with a long exposure the dark part of the radiance range fills pixel values 0 to 255, while bright regions saturate.
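A toy capture model makes the two slides above concrete: the same radiances either clip or crush depending on the exposure time. The function name and the unit full-well value are assumptions:

```python
import numpy as np

def capture(radiance, dt, full_well=1.0):
    """Toy camera: pixel value is exposure (radiance * dt), clipped to the
    sensor's capacity and quantized to 8 bits."""
    exposure = np.clip(radiance * dt, 0.0, full_well)
    return np.round(255 * exposure / full_well).astype(np.uint8)

# A scene spanning six orders of magnitude of radiance:
radiance = np.array([1e-3, 1e-1, 1e1, 1e3])
print(capture(radiance, dt=1.0))    # long exposure: bright regions clip to 255
print(capture(radiance, dt=1e-3))   # short exposure: dark regions crush to 0
```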
  58. 58. Real-world response functions
  59. 59. Camera calibration • Geometric – how pixel coordinates relate to directions in the world • Photometric – how pixel values relate to radiance amounts in the world
  60. 60. Camera is not a photometer • Limited dynamic range ⇒ perhaps use multiple exposures? • Unknown, nonlinear response ⇒ not possible to convert pixel values to radiance directly • Solution: recover the response curve from multiple exposures, then reconstruct the radiance map
  61. 61. Varying exposure • Ways to change exposure – Shutter speed – Aperture – Neutral density filters
  62. 62. Shutter speed • Note: shutter times usually obey a power series – each “stop” is a factor of 2: ¼, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000 sec • Usually really is: ¼, 1/8, 1/16, 1/32, 1/64, 1/128, 1/256, 1/512, 1/1024 sec
  63. 63. Varying shutter speeds
  64. 64. Algorithm: photograph the same scene (marker pixels 1, 2, 3) at several shutter speeds, e.g. Δt = 1/64, 1/16, 1/4, 1, 4 sec. Z = F(exposure), exposure = radiance × Δt, so log exposure = log radiance + log Δt
  65. 65. Response curve (pixel value vs. ln exposure): first assuming unit radiance for each pixel, then after adjusting radiances to obtain a smooth response curve
  66. 66. Results (color film)
  67. 67. Recovered response function
  68. 68. Reconstructed radiance map
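A sketch of the reconstruction step in the spirit of Debevec and Malik: once a response curve g has been recovered (with g[z] = ln exposure for pixel value z), the log radiance at each pixel is a weighted average over exposures of g(Z) - ln Δt. The hat-shaped weighting and the function name are illustrative assumptions:

```python
import numpy as np

def radiance_map(images, dts, g):
    """Reconstruct a log-radiance map from multiple exposures.

    images : list of uint8 images (same size) taken at shutter times dts
    g      : recovered response curve with 256 entries, g[z] = ln(exposure)

    For each pixel, ln(radiance) is the weighted average over exposures of
    g(Z) - ln(dt), with a hat weighting that trusts mid-range pixel values
    more than nearly dark or nearly saturated ones."""
    z = np.arange(256)
    w = np.minimum(z, 255 - z).astype(float)   # hat function, 0 at the extremes
    num = np.zeros(images[0].shape, dtype=float)
    den = np.zeros_like(num)
    for img, dt in zip(images, dts):
        wi = w[img]
        num += wi * (g[img] - np.log(dt))
        den += wi
    return num / np.maximum(den, 1e-8)          # ln(radiance) per pixel
```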
  69. 69. What is this for? • Human perception • Vision/graphics applications
  70. 70. Easier HDR reconstruction raw image = 12-bit CCD snapshot
  71. 71. Easier HDR reconstruction: Z = F(exposure), exposure = radiance × Δt, log exposure = log radiance + log Δt
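With a linear 12-bit raw image there is no response curve to recover: radiance is simply the black-level-corrected raw value divided by Δt for unsaturated pixels. A sketch; the black level and saturation values are illustrative assumptions:

```python
import numpy as np

def radiance_from_raw(raw, dt, black_level=64, saturation=4095):
    """Convert a linear 12-bit raw frame and its exposure time into radiance.
    Saturated pixels carry no information, so they are flagged invalid."""
    linear = np.clip(raw.astype(float) - black_level, 0, None)
    radiance = linear / dt
    valid = raw < saturation
    return radiance, valid
```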
  72. 72. References • http://www.howstuffworks.com/digital-camera.htm • http://electronics.howstuffworks.com/autofocus.htm • Ramanath, Snyder, Bilbro, and Sander. Demosaicking Methods for Bayer Color Arrays, Journal of Electronic Imaging, 11(3), pp. 306-315. • Paul E. Debevec, Jitendra Malik, Recovering High Dynamic Range Radiance Maps from Photographs, SIGGRAPH 1997. • http://www.worldatwar.org/photos/whitebalance/index.mhtml • http://www.100fps.com/
