
# Chap. 10 Computational Photography

Slides for a study meeting held in our lab, covering Chapter 10, "Computational Photography," of Szeliski's Computer Vision: Algorithms and Applications.
Aizawa-Yamasaki Lab. at The Univ. of Tokyo http://www.hal.t.u-tokyo.ac.jp/


1. Computer Vision: Algorithms and Applications, Chapter 10: Computational Photography (Kazuhiro Sato)
2. What is Computational Photography?
3. To enhance or extend the capabilities of digital photography
4. Contents (computational photography): (1) photometric calibration, (2) high dynamic range imaging, (3) super-resolution and blur removal, (4) image matting and compositing, (5) texture analysis and synthesis
5. 1. Photometric Calibration
6. The imaging pipeline
7. First, we characterize: • the mapping functions • the amounts of noise
8. The mapping functions consist of • the radiometric response function • vignetting • the point spread function
9. 1.1 Radiometric Response Function: often NOT linear (figure: optics and sensor plane)
10. How can we determine the function? • calibration chart • polynomial approximation • least squares (explained later)
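The polynomial-approximation and least-squares steps can be sketched as follows. This is a toy illustration, not the book's exact procedure: the calibration-chart readings are invented, and a cubic is an arbitrary choice of degree.

```python
import numpy as np

# Hypothetical calibration-chart data: known scene radiances and the
# (normalized) pixel values the camera recorded for them.
radiance = np.array([0.02, 0.05, 0.10, 0.20, 0.35, 0.55, 0.80, 1.00])
pixel = np.array([14, 36, 70, 118, 165, 203, 232, 248]) / 255.0

# Approximate the inverse response function with a low-order polynomial,
# fitted by least squares: radiance ~ poly(pixel).
coeffs = np.polyfit(pixel, radiance, deg=3)
inverse_response = np.poly1d(coeffs)

# Linearize an observed pixel value.
linear = float(inverse_response(0.5))
```

Evaluating `inverse_response` on raw pixel values then gives (approximately) linear scene radiance, which the later HDR steps assume.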
11. 1.2 Noise Level Estimation [Liu et al. '08]: (1) segment the input image, (2) fit a linear function, (3) plot standard deviation against pixel value, (4) fit a lower envelope to obtain the estimated noise level
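The lower-envelope idea can be sketched on synthetic data. This is a simplified stand-in for the cited method: fixed blocks replace the segmentation, the image and noise level are invented, and no per-segment linear fit is performed.

```python
import numpy as np

rng = np.random.default_rng(0)
true_sigma = 0.02
# Piecewise-constant image (a stand-in for segmented regions) plus
# Gaussian noise of known strength.
levels = rng.uniform(0.2, 0.8, (16, 16))
base = np.kron(levels, np.ones((8, 8)))            # 128 x 128 image
img = base + rng.normal(0.0, true_sigma, base.shape)

# Per-block mean and standard deviation.
bs = 8
means, stds = [], []
for i in range(0, img.shape[0], bs):
    for j in range(0, img.shape[1], bs):
        block = img[i:i + bs, j:j + bs]
        means.append(block.mean())
        stds.append(block.std())
means, stds = np.array(means), np.array(stds)

# Lower envelope of the (mean, std) scatter: within each intensity bin,
# the smallest std comes from the flattest blocks, so it approximates
# the noise level at that intensity.
bins = np.digitize(means, np.linspace(means.min(), means.max() + 1e-9, 8))
envelope = [stds[bins == b].min() for b in np.unique(bins)]
noise_estimate = float(np.median(envelope))
```

On this synthetic image the envelope recovers a value close to the injected `true_sigma`.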
12. 1.3 Vignetting: radial gradient symmetry [Zheng et al. '08] (figure: gradient histogram asymmetry)
13. 1.4 Optical Blur Estimation: PSF estimation [Joshi et al. '08]
14. Solving in a Bayesian framework using maximum a posteriori (MAP) estimation
15. Recovering the PSF without calibration: (1) fit step edges to the elongated ones, (2) apply least squares at every surrounded pixel
16. 2. High Dynamic Range Imaging
17. Dynamic range (figure: luminance [cd/m²] from 0.01 to 10,000, spanning moonlight, indoor lighting, and sunlight; the human eye covers far more of this range than a CCD camera)
18. The natural world is too bright to be captured!
19. Creating a properly exposed photo (high dynamic range imaging): combine different exposures to create a properly exposed photo
20. How to create such a photo? (1) Estimate the radiometric response function, (2) estimate a radiance map, (3) tone map the resulting HDR image into an 8-bit one
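Step (2), merging bracketed exposures into a radiance map, can be sketched with toy scalar pixels. This is only an illustration, not the algorithm of any cited paper: the exposure times and the hat-shaped weighting are assumptions, and the inputs are taken as already linearized.

```python
import numpy as np

# Toy scene radiances and hypothetical bracketed exposure times; each
# shot is the linear sensor response clipped to the [0, 1] sensor range.
radiance_true = np.array([0.05, 0.4, 2.5, 12.0])
exposure_times = [1.0, 1 / 8, 1 / 64]
shots = [np.clip(radiance_true * t, 0.0, 1.0) for t in exposure_times]

def weight(z):
    # Hat weighting: trust mid-range pixels, which are least affected by
    # noise (dark end) and clipping (bright end).
    return np.minimum(z, 1.0 - z)

# Weighted average of per-shot radiance estimates z / t.
num = sum(weight(z) * z / t for z, t in zip(shots, exposure_times))
den = sum(weight(z) for z in shots)
radiance_map = num / np.maximum(den, 1e-8)   # composite radiance map
```

Clipped pixels get zero weight, so each scene value is recovered from whichever exposures saw it unclipped.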
21. 1. Estimate the radiometric response function [Debevec et al. '97]
22. 2. Estimate a radiance map [Mitsunaga et al. '99]: merge the input images at different exposures into a composite radiance map
23. (Figure: merging different exposures into a grayscale radiance map)
24. 3. Tone map the resulting HDR image into an 8-bit one (HDR image → 8 bits/pixel image)
25. 2.1 Tone Mapping: global tone mapping using a transfer curve [Larson et al. '97]. This global approach fails to preserve details in regions with widely varying exposures. (Figure: input HDR image; gamma applied to each channel; gamma applied to luminance only)
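A minimal sketch of global tone mapping with a single transfer curve (toy luminance values; the gamma exponent is an arbitrary assumption, not the operator of the cited paper):

```python
import numpy as np

# Toy HDR luminances spanning four orders of magnitude.
hdr = np.array([0.01, 0.1, 1.0, 10.0, 100.0])

# Global tone mapping: normalize, compress with a gamma curve, quantize
# to 8 bits.  One curve is applied to every pixel identically.
gamma = 0.3
ldr = np.clip((hdr / hdr.max()) ** gamma * 255.0, 0, 255).astype(np.uint8)
```

Because the same curve is applied everywhere, local contrast in very dark or very bright regions is inevitably compressed, which is exactly the failure mode the slide mentions.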
26. Local tone mapping using a bilateral filter [Durand et al. '02]
27. This approach doesn't create visible halos around the edges. (Figure: result with low-pass filtering shows visible halos; result with bilateral filtering shows none)
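A simplified 1D sketch in the spirit of this bilateral-filter approach (a scan line instead of an image; the filter parameters and compression factor are assumptions): smooth the log luminance with an edge-preserving filter to get a base layer, compress only the base, and keep the detail layer.

```python
import numpy as np

def bilateral_1d(x, sigma_s=3.0, sigma_r=0.4, radius=6):
    """Brute-force 1D bilateral filter: spatial Gaussian times range
    Gaussian, so smoothing stops at large intensity jumps (edges)."""
    out = np.empty_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - radius), min(len(x), i + radius + 1)
        d = np.arange(lo, hi) - i
        w = (np.exp(-d ** 2 / (2 * sigma_s ** 2))
             * np.exp(-(x[lo:hi] - x[i]) ** 2 / (2 * sigma_r ** 2)))
        out[i] = np.sum(w * x[lo:hi]) / np.sum(w)
    return out

# Toy HDR scan line: a bright region next to a dark one, with fine detail.
rng = np.random.default_rng(1)
lum = np.where(np.arange(100) < 50, 100.0, 0.1)
lum = lum * (1.0 + 0.05 * rng.standard_normal(100))
log_lum = np.log10(lum)

# Base/detail decomposition; compress only the base, then re-exponentiate.
base = bilateral_1d(log_lum)
detail = log_lum - base
compression = 0.4            # assumed base-contrast compression factor
tone_mapped = 10.0 ** (compression * base + detail)
```

The range kernel keeps the filter from averaging across the big step, so the base layer follows the edge and no halo appears when the base is compressed.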
28. Gradient domain tone mapping [Fattal et al. '02]: the new luminance is combined with the original color image
29. (Figure: attenuation map and tone-mapped result)
30. 2.2 Flash Photography [Petschnigg et al. '04]: combine flash and non-flash images to achieve better photos
31. The joint bilateral filter's kernel: domain kernel × range kernel
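The kernel structure can be sketched in 1D: the domain kernel is a spatial Gaussian, while the range kernel is computed from the sharp flash image even though the averaging is done on the noisy no-flash image. The signals and parameters below are invented for illustration.

```python
import numpy as np

def joint_bilateral_1d(signal, guide, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Joint bilateral filter sketch: range weights come from a separate
    'guide' image (the flash shot) rather than from the signal itself."""
    out = np.empty_like(signal)
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        d = np.arange(lo, hi) - i
        w = (np.exp(-d ** 2 / (2 * sigma_s ** 2))                 # domain
             * np.exp(-(guide[lo:hi] - guide[i]) ** 2
                      / (2 * sigma_r ** 2)))                      # range
        out[i] = np.sum(w * signal[lo:hi]) / np.sum(w)
    return out

# Toy step edge: the flash image is clean, the no-flash one dim and noisy.
rng = np.random.default_rng(3)
flash = np.where(np.arange(60) < 30, 0.2, 0.8)
noflash = 0.5 * flash + 0.05 * rng.standard_normal(60)
denoised = joint_bilateral_1d(noflash, flash)
```

Because the clean flash image defines the edges, the no-flash image is denoised without blurring across the step.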
32. 3. Super-Resolution and Blur Removal
33. 3.1 Color Image Demosaicing: interpolate the Bayer RGB pattern captured by the camera sensor into a full-color RGB image
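The simplest form of this interpolation, bilinear demosaicing of the green channel, can be sketched on a toy mosaic (the RGGB layout and the pixel values below are assumptions for illustration):

```python
import numpy as np

# Toy RGGB Bayer mosaic: every pixel stores one colour sample; in this
# layout the green samples sit where (row + col) is odd.  The scene's
# true green is a constant 0.6, so the result is easy to check.
raw = np.array([
    [0.50, 0.60, 0.50, 0.60],
    [0.60, 0.50, 0.60, 0.50],
    [0.50, 0.60, 0.50, 0.60],
    [0.60, 0.50, 0.60, 0.50],
])
h, w = raw.shape
green_mask = (np.add.outer(np.arange(h), np.arange(w)) % 2) == 1
green = np.where(green_mask, raw, 0.0)

# Bilinear interpolation: each missing green value is the average of its
# available horizontal/vertical green neighbours.
out = green.copy()
for i in range(h):
    for j in range(w):
        if not green_mask[i, j]:
            nbrs = [green[a, b]
                    for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= a < h and 0 <= b < w]
            out[i, j] = sum(nbrs) / len(nbrs)
```

Bilinear interpolation is the baseline that the Bayesian two-color method on the next slides improves upon, since simple averaging blurs across color edges.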
34. Bayesian demosaicing with a two-color prior [Bennett et al. '06] (figure: original, bilinear, [Bennett et al. '06])
35. Two-color model [Bennett et al. '06]
36. 4. Image Matting and Compositing
37. Image "matting" and "compositing"? (Figure: input → alpha matting → compositing → output)
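Matting and compositing are tied together by the standard compositing (matting) equation, C = αF + (1 − α)B: matting estimates the alpha matte and foreground from an input image, while compositing applies the equation with a new background. A minimal numeric illustration (all values invented):

```python
import numpy as np

F = np.full((2, 2), 0.9)           # foreground colour (single channel)
B = np.full((2, 2), 0.1)           # new background colour
alpha = np.array([[1.00, 0.50],
                  [0.25, 0.00]])   # per-pixel foreground coverage

# The compositing equation: C = alpha * F + (1 - alpha) * B.
C = alpha * F + (1.0 - alpha) * B
```

Fully covered pixels take the foreground colour, uncovered ones the background, and fractional alphas blend the two, which is what makes soft boundaries (hair, smoke) composite convincingly.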
38. What is the problem in matting and compositing?
39. Failed example of matting (figure: input, alpha matte, composite, inset)
40. 4.1 Blue Screen Matting [Smith et al. '96]
41. Two-screen matting [Smith et al. '96]
42. 4.2 Natural Image Matting [Chuang et al. '01] (figure: input, hand-drawn trimap, alpha map, foreground composite)
43. Knockout [Berman et al. '00]
44. Bayesian approach [Chuang et al. '01]
45. Comparison of natural image matting methods
46. 4.3 Optimization-Based Matting: border matting [Rother et al. '04]. 1. Get a trimap by hard segmentation
47. (Energy: data term + smoothness term)
48. Color line (closed-form) matting [Levin et al. '08] (energy: data term + regularization term)
49. 4.4 Smoke, Shadow, and Flash Matting: smoke matting [Chuang et al. '02] (figure: input frame, removing the estimated foreground object, alpha matte, insertion of new objects)
50. Shadow matting [Chuang et al. '03]
51. 4.5 Video Matting [Chuang et al. '02]
52. 5. Texture Analysis and Synthesis
53. Texture synthesis: synthesizing a similar-looking larger patch from a small sample patch
54. Texture synthesis using non-parametric sampling [Efros et al. '99, '01]
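The non-parametric sampling idea can be sketched in a greatly simplified form: grow the output pixel by pixel, copying the sample pixel whose already-synthesized causal neighbourhood matches best. This greedy toy (a stripe texture, a 1-pixel neighbourhood, best match instead of sampling among several good matches) only gestures at the cited method:

```python
import numpy as np

# Tiny sample texture: vertical stripes of period two, shape (8, 8).
sample = np.tile(np.array([[0.0, 1.0]]), (8, 4))

def synthesize(sample, out_h, out_w, win=1):
    """Greedy sketch of non-parametric texture synthesis: each new pixel
    copies the sample pixel whose causal neighbourhood (pixels above and
    to the left, already synthesized) matches best in squared error."""
    out = np.zeros((out_h, out_w))
    out[:win, :] = sample[:win, :out_w]      # seed first row(s)
    out[:, :win] = sample[:out_h, :win]      # seed first column(s)
    sh, sw = sample.shape
    for i in range(win, out_h):
        for j in range(win, out_w):
            best_val, best_err = 0.0, np.inf
            for si in range(win, sh):
                for sj in range(win, sw):
                    err = (np.sum((out[i - win:i, j - win:j + 1]
                                   - sample[si - win:si, sj - win:sj + 1]) ** 2)
                           + np.sum((out[i, j - win:j]
                                     - sample[si, sj - win:sj]) ** 2))
                    if err < best_err:
                        best_err, best_val = err, sample[si, sj]
            out[i, j] = best_val
    return out

tex = synthesize(sample, 6, 6)   # reproduces the stripe pattern
```

Real implementations use larger neighbourhoods and sample randomly among all close matches, which avoids the repetitive output a pure best-match rule produces on natural textures.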
55. 5.1 Hole Filling and Inpainting (figure: original, inpainted result)
56. Exemplar-based inpainting [Criminisi et al. '04] (priority: confidence term and data term)
57. 3. Update confidence values
58. Onion peel vs. [Criminisi et al. '04] (figure: original, removed region); the latter preserves the gradient along the region boundary
59. 5.2 Non-Photorealistic Rendering using texture synthesis: • texture transfer • image analogies
60. Texture transfer [Efros et al. '01] (figure: input → texture transfer → output)
61. 2. Find the minimum error boundary cut
62. Image analogies [Hertzmann et al. '01]: synthesize this image?
63. (Figure: unfiltered example, NPR-filtered example, target, result)
64. (Figure: unfiltered example, filtered example, target, result)
65. Thank you!