Defocus magnification

Modification of Defocus Effects with a single image without generating an accurate depth map -- Paper Presentation
Slide notes:
  • So, when a scene is captured as an image, i.e. a photograph, by a camera, some objects of the scene are in focus while others are out of focus, i.e. in defocus. Going back to the problem definition, let us consider the motivation behind all this effort. We have a subjective impression that we view our surroundings in clear, sharp focus. This relates back to the photographic tradition, where more or less the complete image remains in focus, i.e. has an infinite depth of field. But this contradicts the biological fact that the images that fall on the retina are typically quite badly focused everywhere except within the central fovea. There is a gradient of focus, ranging from nearly perfect focus at the point of regard to almost complete blur at points on distant objects. This gradient of focus, inherent in biological and most other optical systems, can be treated as a useful source of depth information, and consequently may be used to recover a depth map (i.e., distances between the viewer and points in the scene).
  • Defocus map, i.e. the measure of blurriness in an image, or the blur estimated at each of the edges of an image.
  • The PSF of an optical system is the irradiance distribution that results from a single point source in object space. Although the source may be a point, the image is not, for two main reasons. First, aberrations in the optical system will spread the image over a finite area. Second, diffraction effects will also spread the image, even in a system that has no aberrations. The PSF evidently depends on the camera lens properties and atmospheric conditions when the image is captured.
  • Our edge-detection method depends upon making reliable inferences about the local shape of the intensity function at each point in an image. Reliability is defined in terms of an overall significance level α_I for the entire image and a pointwise significance level α_p. The noise at a given point in the image is modeled as a normally distributed random variable with standard deviation s_n (s_n = 2.5), independent of the signal and of the noise at other points in the image.
  • Our edge-detection method depends upon making reliable inferences about the local shape of the intensity function at each point in an image. Reliability is defined in terms of an overall significance level α_I for the entire image and a pointwise significance level α_p. A weighted sum of these two filter responses is used to compute the gradient direction θ that maximizes the gradient magnitude.

    1. 1. Defocus Magnification<br />Soonmin Bae & Frédo Durand<br />Computer Science and Artificial Intelligence Laboratory<br />Massachusetts Institute of Technology<br />Proceedings of <br />EUROGRAPHICS 2007<br />Presented by<br />Debaleena Chattopadhyay<br />
    2. 2. Presentation Outline<br />What?<br />- The problem definition<br />Why?<br />- The Novelty of the paper<br />How?<br />- The solution to the problem<br />Results<br />- The outcome<br />Discussion<br />- The further scope of enhancement<br />
    3. 3. The Problem Definition<br />
    4. 4. Defocus<br /> What is defocus? – It is the result of causing a lens to deviate from accurate focus. <br /> Depth of field – When a certain object is brought into focus, objects away from the plane of focus appear blurred, and the amount of blur increases with their distance from that plane. <br />Defocus and Geometry— This suggests that defocus and geometry (3D orientation of the scene) are related and, therefore, that it is possible to estimate the geometry of a scene by measuring the amount of defocus in an image. <br />Defocus Magnification— Magnify the defocus effects within an image, i.e. blur the blurry regions and keep the sharp regions sharp.<br />
    5. 5. SLR vs. Point-and-Shoot<br />SLR cameras can produce such a shallow depth of field that the main subject stays sharp while the background is blurred.<br />Sharp foreground with blurred background<br />Photo Credit: Bae & Durand<br />
    6. 6. A Point-and-Shoot Camera<br />Small point-and-shoot cameras do not permit enough defocus due to the small diameter of their lens and their small sensors.<br />Background is not blurred enough<br />Photo Credit: Bae & Durand<br />
    7. 7. Defocus and Aperture size<br />A bigger aperture produces more defocus<br />The f-number N gives the aperture diameter A as a fraction of the focal length f (A = f/N)<br />Example: f = 100 mm, f/2 → A = 50 mm, f/4 → A = 25 mm<br />f/2<br />f/4<br />7<br />sensor<br />lens<br />focal plane<br />Slide Credit: Bae & Durand<br />
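The f-number arithmetic on this slide can be sanity-checked with a few lines of code (the helper name is mine, not from the paper):

```python
def aperture_diameter(focal_length_mm: float, f_number: float) -> float:
    """Aperture diameter A = f / N: the f-number N expresses the
    aperture as a fraction of the focal length f."""
    return focal_length_mm / f_number

# The slide's example, a 100 mm lens:
assert aperture_diameter(100, 2) == 50.0  # f/2 gives a 50 mm aperture
assert aperture_diameter(100, 4) == 25.0  # f/4 gives a 25 mm aperture
```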
    8. 8. Defocus and Sensor size<br />Sensor size<br /><ul><li>Small sensor → small lens → less defocus
    9. 9. Defocus size is mostly proportional to the sensor size</li></ul>Large sensor (22.2 x 14.8 mm), f/2.8<br />blurred background<br />Small sensor (7.18 x 5.32 mm), f/2.8<br />background remains sharp<br />Slide Credit: Bae & Durand<br />
    10. 10. The Problem Definition<br />To present an image-processing technique that <br />magnifies existing defocus <br />given a single photo.<br />(i.e. to simulate a shallow depth of field)<br />Input Image<br />Output Image<br />
    11. 11. The Novelty<br />
    12. 12. The Novelty<br /><ul><li> Given the problem definition, it seems obvious that we need to compute depth information, decide which regions are blurry and which are sharp, and then make the blurry regions blurrier while keeping the sharp regions sharp.
    13. 13. A related working domain is estimating shape (3D geometry) from defocus information. This is called Depth from Defocus problem.
    14. 14. Depth from Defocus— Calculates the exact depth map, needs more than one image in different focus settings, and is a hard problem
    15. 15. Some related works are:</li></ul> [Horn 68; Pentland 87; Darrell 88; Ens 93; Nayar 94; Watanabe 98; Favaro 02; Jin 02; Favaro 05; Hasinoff 06]<br />
    16. 16. The Novelty<br /><ul><li> The Novelty of this work lies in the fact that, to modify the </li></ul> defocus of an image, the authors—<br /><ul><li> Do not calculate precise depth estimation.
    17. 17. Use a single image in a single focus setting.
    18. 18. Do not differentiate between out-of-focus edges and originally smooth edges.
    19. 19. Estimate the blur within the image by computing the blur kernel, then magnify it and propagate it throughout the image.</li></li></ul><li>The Solution<br />
    20. 20. The Solution<br />Overview<br />Input Photo<br />Defocus Map<br />Magnify Defocus<br />Blur <br />Estimation<br />Blur <br />Propagation<br />Output Photo<br />Detect Blurred Edges<br />Estimate <br />Blur<br />Refine Blur <br />Estimation<br />Cross Bilateral Filtering<br />Use Sharpness Bias<br />
    21. 21. edge<br />gaussian blur<br />blurred edge<br />The Solution<br />Blurred Edge Detection<br />Follows Elder & Zucker’s multiscale edge detection method.<br />[ELDER J. H., ZUCKER S.W.: Local scale control for edge detection and blur estimation. IEEE Transactions on PAMI 20, 7 (1998), 699–716.]<br />An edge can be defined as a step function in intensity.<br />The blur of this edge (mostly due to the PSF of an optical system) is modeled as a Gaussian blurring kernel.<br /><ul><li>15</li></li></ul><li>The Solution<br />Blurred Edge Detection<br />Multi-scale edge detector working formulae:<br />The constants and thresholds:<br /><ul><li> Sensor noise n(x, y) is modeled as a stationary, additive, zero-mean white noise process with standard deviation sn (sn = 2.5),
    22. 22. Reliability is defined in terms of an overall significance level αI for the entire image and a pointwise significance level αp (αI = 0.0001%).</li></li></ul><li>The Solution<br />Blurred Edge Detection<br />Multi-scale edge detector working formulae:<br />The edge detection scale<br /><ul><li> For each pixel, multiscale responses are computed to the steerable Gaussian first derivative filters and the steerable second derivative of Gaussian filters. The gradient direction θ is computed using the steerable Gaussian first derivative basis filters.
    23. 23. The filter responses are then tested for reliability using certain thresholds.
    24. 24. The right scales for edge detection, as defined in the paper, are:</li></ul> σ1 = {64 32 16 8 4 2 1 0.5} and σ2 = {32 16 8 4 2 1 0.5} pixels<br />
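The edge model from the slides above, a step in intensity convolved with a Gaussian blurring kernel, can be reproduced in a few lines. A minimal 1-D NumPy sketch (function names are my own):

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    # Normalized 1-D Gaussian blurring kernel.
    radius = radius or int(4 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blurred_edge(sigma_b, n=101):
    # A unit step edge convolved with a Gaussian of size sigma_b:
    # the model used here for a defocused edge.
    step = (np.arange(n) >= n // 2).astype(float)
    return np.convolve(step, gaussian_kernel(sigma_b), mode="same")

def transition_width(edge, lo=0.05, hi=0.95):
    # Number of samples strictly inside the edge's transition band.
    return int(np.sum((edge > lo) & (edge < hi)))

# More blur produces a wider transition band around the edge.
assert transition_width(blurred_edge(4.0)) > transition_width(blurred_edge(1.0))
```

The width of that transition band is exactly the quantity the multiscale detector must recover, which is why the scale set σ1, σ2 spans such a large range.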
    25. 25. The Solution<br />Blurred Edge Detection<br />Multi-scale edge detector working formulae:<br />The Gaussian Derivative filters<br />The first order Gaussian derivative filter, with σ1 varying over the <br />previously defined scales.<br />
    26. 26. The Solution<br />Blurred Edge Detection<br />Multi-scale edge detector working formulae:<br />The Gaussian Derivative filters<br /> The second order Gaussian derivative filter, with σ2 varying over the previously defined scales.<br />
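A rough illustration of these steerable Gaussian derivative filters, assuming simple separable convolutions in NumPy; this is a sketch of the idea, not the authors' implementation:

```python
import numpy as np

def conv_sep(img, kx, ky):
    # Separable 2-D filtering: 1-D convolution along rows, then columns.
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kx, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, ky, mode="same"), 0, tmp)

def gradient_response(img, sigma=2.0):
    # Steerable first-derivative basis: d/dx and d/dy of a 2-D Gaussian,
    # built from 1-D Gaussian and Gaussian-derivative kernels.
    radius = int(4 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    g = g / g.sum()
    d1 = -x / sigma**2 * np.exp(-x**2 / (2 * sigma**2))
    rx = conv_sep(img, d1, g)   # response to the x-derivative filter
    ry = conv_sep(img, g, d1)   # response to the y-derivative filter
    theta = np.arctan2(ry, rx)  # direction maximizing gradient magnitude
    return rx, ry, theta

# A vertical step edge: the x response dominates, the y response vanishes.
img = np.tile((np.arange(40) >= 20).astype(float), (40, 1))
rx, ry, theta = gradient_response(img)
assert abs(rx[20, 20]) > 100 * abs(ry[20, 20])
```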
    27. 27. The Solution<br />Blurred Edge Detection<br />Multi-scale edge detector working formulae:<br />Reliability criterion working formulae:<br /> Reliability of the filter responses is tested against a threshold, computed as follows (c1 and c2 for the first and second order Gaussian derivative filters, with scales σ1 and σ2 respectively):<br />
    28. 28. d<br />2nd derivative<br />The Solution<br />Blur Estimation at edges<br /><ul><li>Fit response models of various sizes</li></ul>less blurry<br />edge<br />more blurry<br />response model<br /><ul><li>21</li></li></ul><li>The Solution<br />Blur Estimation at edges<br />Where the response model is given as (σb is the size of the blur kernel):<br />
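The model-fitting step can be sketched as a brute-force search: the second-derivative filter response to a step blurred by σb has a first-derivative-of-Gaussian shape whose width combines σb with the filter scale σ2, so one can fit candidate σb values and keep the best. A hedged NumPy sketch (the function names and the brute-force search are mine, not the paper's exact procedure):

```python
import numpy as np

def response_model(x, sigma_b, sigma2, amp=1.0):
    # Response of a second-derivative Gaussian filter (scale sigma2) to a
    # step edge blurred by sigma_b: a first-derivative-of-Gaussian shape
    # whose squared width is sigma_b^2 + sigma2^2.
    s2 = sigma_b**2 + sigma2**2
    return amp * -x / s2 * np.exp(-x**2 / (2 * s2))

def estimate_blur(observed, x, sigma2, candidates):
    # Fit response models of various sizes: for each candidate blur, solve
    # for the least-squares amplitude and keep the best-fitting size.
    best, best_err = None, np.inf
    for sb in candidates:
        m = response_model(x, sb, sigma2)
        amp = observed @ m / (m @ m)          # closed-form amplitude fit
        err = np.sum((observed - amp * m) ** 2)
        if err < best_err:
            best, best_err = sb, err
    return best

# Synthetic check: a response generated with sigma_b = 3 is recovered.
x = np.arange(-30, 31, dtype=float)
obs = response_model(x, 3.0, 1.0, amp=2.5)
assert estimate_blur(obs, x, 1.0, np.arange(0.5, 8.5, 0.5)) == 3.0
```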
    29. 29. our blur measure<br />input<br />The Solution<br />Robust Blur Estimation<br />Successfully measures the blur size in spite of the influence of nearby scene events<br />blurry<br />sharp<br />23<br />
    30. 30. The Solution<br />The Blur Measure<br />A sparse set (BM)<br /><ul><li>values only at edges
    31. 31. Grey means no value </li></ul>blurry<br />input<br />blur measure<br />sharp<br />
    32. 32. The Solution<br />Refinement of Blur Estimation<br />Erroneous blur estimates <br />due to soft shadows and glossy highlights<br />blurry<br />input<br />blur measure<br />sharp<br />
    33. 33. The Solution<br />Refinement of Blur Estimation<br /><ul><li>Erroneous blur estimates
    34. 34. due to soft shadows and glossy highlights</li></ul>blurry<br />input<br />blur measure<br />sharp<br />26<br />
    35. 35. The Solution<br />Remove Outliers<br />Using cross bilateral filtering [Eisemann 04, Petschnigg 04] <br />a weighted mean of neighboring blur measures.<br />blurry<br />before refinement<br />after refinement<br />sharp<br />
    36. 36. The Solution<br />Refine Blur Estimation<br />The biased cross bilateral filtering of a sparse set of blur measures BM at an edge pixel p is formulated as follows:<br />Where b(BM) = exp(-BM/2)<br /> gσ(x) = exp(-x²/(2σ²))<br />σr = 10% of the intensity range<br />σs = 10% of the image size<br />
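A 1-D NumPy sketch of this biased cross bilateral filter (variable names are mine; the paper operates on 2-D images):

```python
import numpy as np

def biased_cross_bilateral(bm, image, mask, sigma_s=2.0, sigma_r=0.1):
    # Replace each blur measure by a weighted mean of neighboring measures.
    # Weights combine spatial distance, similarity in the *input image*
    # (the "cross" part), and the sharpness bias b(BM) = exp(-BM/2),
    # which down-weights blurry (outlier) estimates.
    idx = np.arange(len(bm), dtype=float)
    out = np.empty(len(bm))
    for p in range(len(bm)):
        w = (np.exp(-(idx - p) ** 2 / (2 * sigma_s**2))            # spatial
             * np.exp(-(image - image[p]) ** 2 / (2 * sigma_r**2)) # range
             * np.exp(-bm / 2)                                     # sharpness bias
             * mask)                                # only pixels with a measure
        out[p] = (w * bm).sum() / w.sum()
    return out

# A lone large blur estimate in a flat image region is pulled toward
# its sharper neighbors, removing the outlier.
bm = np.array([1.0, 1.0, 1.0, 9.0, 1.0, 1.0, 1.0])
image = np.zeros(7)          # flat region, so range weights are all 1
mask = np.ones(7)
refined = biased_cross_bilateral(bm, image, mask)
assert refined[3] < 2.0
```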
    37. 37. blur measure<br />input<br />The Solution<br />Blur Propagation<br />Given a sparse set of the blur measure (BM)<br />Propagate the blur measure to the entire image<br />Assumption: blurriness (B) is smooth except at image edges<br />Inspired by [Levin et al. 2004]<br />
    38. 38. The Solution<br />Blur Propagation<br />Given a sparse set of the blur measure (BM)<br />Propagate the blur measure to the entire image<br />Assumption: blurriness (B) is smooth except at image edges<br />We minimize<br />data term<br />smoothness term<br />proportional to e^(-||C(p) - C(q)||²)<br />αp = 0.5 for edge pixels.<br />30<br />
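The minimization on this slide is a sparse linear least-squares problem; a 1-D NumPy sketch (my own simplification, solving the normal equations densely rather than with a sparse solver):

```python
import numpy as np

def propagate_blur(bm, mask, image, alpha=0.5, sigma_c=0.1):
    # Minimize  sum_p alpha*(B(p) - BM(p))^2        (data term, edge pixels)
    #         + sum_pq w_pq*(B(p) - B(q))^2         (smoothness term)
    # with w_pq = exp(-(C(p) - C(q))^2 / sigma_c^2), so propagation stops
    # where the input image itself has an edge.
    n = len(bm)
    A = np.zeros((n, n))
    rhs = np.zeros(n)
    for p in range(n - 1):  # smoothness between 1-D neighbors
        w = np.exp(-(image[p] - image[p + 1]) ** 2 / sigma_c**2)
        A[p, p] += w; A[p + 1, p + 1] += w
        A[p, p + 1] -= w; A[p + 1, p] -= w
    for p in range(n):      # data term at pixels holding a sparse measure
        if mask[p]:
            A[p, p] += alpha
            rhs[p] = alpha * bm[p]
    return np.linalg.solve(A, rhs)

# Two sparse measures on either side of an intensity edge: the sharp
# value fills the left region, the blurry value fills the right.
image = np.array([0.0] * 5 + [1.0] * 5)
bm = np.zeros(10); mask = np.zeros(10, dtype=bool)
bm[2], mask[2] = 1.0, True   # sharp measure on the left
bm[7], mask[7] = 5.0, True   # blurry measure on the right
B = propagate_blur(bm, mask, image)
assert abs(B[0] - 1.0) < 1e-3 and abs(B[9] - 5.0) < 1e-3
```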
    39. 39. blur measure<br />input<br />defocus map<br />The Solution<br />Blur Propagation<br />Edge-preserving propagation<br /><ul><li>propagation stops at input edges </li></li></ul><li>The Solution<br />Blur the blurry regions<br /> We use Photoshop Lens Blur to generate results with our defocus map instead of a depth map<br />
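The authors delegate the final blurring to Photoshop's Lens Blur; as a crude stand-in for illustration only (this is not the Lens Blur algorithm), one can apply a Gaussian whose size follows the defocus map:

```python
import numpy as np

def magnify_defocus(signal, defocus_map, gain=2.0):
    # Blur each sample with a normalized Gaussian whose size is the local
    # defocus estimate scaled by `gain`; samples the defocus map marks as
    # sharp (sigma ~ 0) are copied through unchanged.
    x = np.arange(len(signal), dtype=float)
    out = np.empty(len(signal))
    for p in range(len(signal)):
        s = gain * defocus_map[p]
        if s < 1e-3:
            out[p] = signal[p]
            continue
        w = np.exp(-(x - p) ** 2 / (2 * s**2))
        out[p] = (w * signal).sum() / w.sum()
    return out

# The sharp half stays untouched; the defocused half gets much smoother,
# which is exactly "blur the blurry regions, keep sharp regions sharp".
signal = np.tile([0.0, 1.0], 20)          # high-frequency detail
dmap = np.array([0.0] * 20 + [1.5] * 20)  # right half is already blurry
out = magnify_defocus(signal, dmap)
assert np.array_equal(out[:20], signal[:20])     # sharp region preserved
assert out[20:].std() < signal[20:].std() / 2    # blurry region smoothed
```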
    40. 40. Recap<br />1. User provides a single input photograph<br />2. Our system automatically produces the defocus map<br />3. We use Photoshop’s lens blur to generate the defocus magnified result<br />input<br />our defocus map<br />Increased defocus<br />33<br />Slide Credit: Bae & Durand<br />
    41. 41. Results<br />
    42. 42. Input<br />Result<br />Defocus Map<br />35<br />Slide Credit: Bae & Durand<br />
    43. 43. 36<br />Input<br />Result<br />Slide Credit: Bae & Durand<br />
    44. 44. Input<br />Result<br />Defocus Map<br />37<br />Slide Credit: Bae & Durand<br />
    45. 45. 38<br />Input<br />Result<br />Slide Credit: Bae & Durand<br />
    46. 46. Input<br />Result<br />Defocus Map<br />39<br />Slide Credit: Bae & Durand<br />
    47. 47. 40<br />Input<br />Result<br />Slide Credit: Bae & Durand<br />
    48. 48. Comparison with the ground truth<br />ground truth (f/4)<br />Input (f/8)<br />ourresult<br />41<br />Slide Credit: Bae & Durand<br />
    49. 49. Discussion<br />
    50. 50. Future work<br />Occlusion boundary <br />Video inputs (motion blur)<br />Refocusing <br />Discussion<br />Summary<br />Analyze existing defocus<br /><ul><li>multiscale edge detector & fitting
    51. 51. non-homogeneous propagation</li></ul>Magnify defocus<br />
    52. 52. Thank you<br />
