Defocus magnification
Modification of Defocus Effects with a single image without generating an accurate depth map -- Paper Presentation


  • When a scene is captured as a photograph by a camera, some objects of the scene are in focus while others are out of focus, i.e., defocused. Going back to the problem definition, consider the motivation behind this effort. We have the subjective impression that we view our surroundings in clear, sharp focus. This relates to the photographic tradition in which more or less the complete image remains in focus, i.e., has an infinite depth of field. But it contradicts the biological fact that the images falling on the retina are typically quite badly focused everywhere except within the central fovea. There is a gradient of focus, ranging from nearly perfect focus at the point of regard to almost complete blur at points on distant objects. This gradient of focus, inherent in biological and most other optical systems, is a useful source of depth information, and consequently may be used to recover a depth map (i.e., the distances between the viewer and points in the scene).
  • Defocus map: the measure of blurriness in an image, i.e., the blur estimated at each edge of the image.
  • The PSF of an optical system is the irradiance distribution that results from a single point source in object space. Although the source may be a point, the image is not, for two main reasons. First, aberrations in the optical system spread the image over a finite area. Second, diffraction effects also spread the image, even in a system that has no aberrations. The PSF evidently depends on the camera lens properties and the atmospheric conditions when the image is captured.
  • Our edge-detection method depends upon making reliable inferences about the local shape of the intensity function at each point in an image. Reliability is defined in terms of an overall significance level α_I for the entire image and a pointwise significance level α_p. The noise at a given point in the image is a normally distributed random variable with standard deviation s_n (s_n = 2.5), independent of the signal and of the noise at other points in the image.
  • A weighted sum of these two filter responses is used to compute the gradient direction θ that maximizes the gradient magnitude.

Defocus magnification: Presentation Transcript

  • Defocus Magnification
    Soonmin Bae & Frédo Durand
    Computer Science and Artificial Intelligence Laboratory
    Massachusetts Institute of Technology
    Proceedings of
    EUROGRAPHICS 2007
    Presented by
    Debaleena Chattopadhyay
  • Presentation Outline
    What?
    - The problem definition
    Why?
    - The Novelty of the paper
    How?
    - The solution to the problem
    Results
    - The outcome
    Discussion
    - The further scope of enhancement
  • The Problem Definition
  • Defocus
    What is Defocus? – It is the result of causing a lens to deviate from accurate focus.
    Depth of focus – While bringing a certain object into focus, objects that are away from it (in focus) appear blurred and the amount of blur increases with the relative distances.
    Defocus and Geometry— This suggests that defocus and geometry (3D orientation of the scene) are related and, therefore, it is possible to estimate the appearance of a scene by measuring the amount of defocus in an image.
    Defocus Magnification— Magnify the defocus effects within an image i.e. to blur blurry regions and keep sharp regions sharp.
  • SLR vs. Point-and-Shoot
    SLR cameras can produce a depth of focus shallow enough to keep the main subject sharp while blurring the background.
    Sharp foreground with blurred background
    Photo Credit: Bae & Durand
  • A Point-and-Shoot Camera
    Small point-and-shoot cameras do not permit enough defocus due to the small diameter of their lens and their small sensors.
    Background is not blurred enough
    Photo Credit: Bae & Durand
  • Defocus and Aperture size
    Bigger aperture produces more defocus
    F-number N gives the aperture diameter A as a fraction of the focal length f (A = f/N)
    Example: f = 100 mm; at f/2, A = 50 mm; at f/4, A = 25 mm
    [Diagram: lens, sensor, and focal plane, comparing the f/2 and f/4 aperture cones]
    Slide Credit: Bae & Durand
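The f-number relation on this slide can be checked with a few lines of Python (a minimal sketch; the function name is ours):

```python
def aperture_diameter(focal_length_mm, f_number):
    """Aperture diameter A = f / N, where f is the focal length
    and N is the f-number."""
    return focal_length_mm / f_number

# The slide's example: a 100 mm lens.
print(aperture_diameter(100, 2))  # f/2 -> 50.0 mm
print(aperture_diameter(100, 4))  # f/4 -> 25.0 mm
```

A smaller f-number therefore means a larger aperture, and (as the slide shows) more defocus.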
  • Defocus and Sensor size
    Sensor size
    • Small sensor → small lens → less defocus
    • Defocus size is mostly proportional to the sensor size
    Large sensor (22.2 × 14.8 mm), f/2.8: blurred background
    Small sensor (7.18 × 5.32 mm), f/2.8: background remains sharp
    Slide Credit: Bae & Durand
  • The Problem Definition
    To present an image-processing technique that
    magnifies existing defocus
    given a single photo.
    (i.e. to simulate shallow depth of field)
    Input Image
    Output Image
  • The Novelty
  • The Novelty
    • Given the problem definition, it seems obvious that we need to compute depth information, decide which regions are blurry and which are sharp, and then make the blurry regions blurrier while keeping the sharp regions sharp.
    • A related working domain is estimating shape (3D geometry) from defocus information, known as the Depth from Defocus problem.
    • Depth from Defocus: calculates an exact depth map, needs more than one image taken with different focus settings, and is a hard problem.
    • Some related works are:
    [Horn 68; Pentland 87; Darrell 88; Ens 93; Nayar 94; Watanabe 98; Favaro 02; Jin 02; Favaro 05; Hasinoff 06]
  • The Novelty
    • The Novelty of this work lies in the fact that, to modify the
    defocus of an image, the authors—
    • Do not calculate precise depth estimation.
    • Use a single image in a single focus setting.
    • Do not differentiate between out-of-focus edges and originally smooth edges.
    • Estimate the blur within the image by computing the blur kernel, then propagate it throughout the image and increase it.
  • The Solution
  • The Solution
    Overview
    Input Photo
    Defocus Map
    Magnify Defocus
    Blur
    Estimation
    Blur
    Propagation
    Output Photo
    Detect Blurred Edges
    Estimate
    Blur
    Refine Blur
    Estimation
    Cross Bilateral Filtering
    Use Sharpness Bias
  • The Solution
    Blurred Edge Detection
    [Diagram: an ideal edge convolved with a Gaussian blur gives a blurred edge]
    Follows Elder & Zucker's multiscale-space edge detection method.
    [ELDER J. H., ZUCKER S. W.: Local scale control for edge detection and blur estimation. IEEE Transactions on PAMI 20, 7 (1998), 699–716.]
    An edge can be defined as a step function in intensity.
    The blur of this edge (mostly due to the PSF of the optical system) is modeled as a Gaussian blurring kernel.
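The step-plus-Gaussian edge model above can be written down directly: convolving a unit step with a Gaussian of standard deviation σ gives the Gaussian CDF. A minimal sketch (function and variable names are ours):

```python
from math import erf, sqrt

def blurred_step(x, sigma):
    """A unit step edge convolved with a Gaussian blur kernel of
    standard deviation sigma: the result is the Gaussian CDF."""
    return 0.5 * (1.0 + erf(x / (sigma * sqrt(2.0))))

# Sampled profile of an edge blurred with sigma = 2 pixels.
profile = [blurred_step(x, 2.0) for x in range(-10, 11)]
```

A larger sigma spreads the intensity transition over more pixels; this is the model the method later fits to detected edges.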
  • The Solution
    Blurred Edge Detection
    Multi-scale edge detector working formulae:
    The constants and thresholds:
    • Sensor noise n(x, y) is modeled as a stationary, additive, zero-mean white-noise process with standard deviation s_n (s_n = 2.5).
    • Reliability is defined in terms of an overall significance level α_I for the entire image and a pointwise significance level α_p (α_I = 0.0001%).
  • The Solution
    Blurred Edge Detection
    Multi-scale edge detector working formulae:
    The edge detection scale
    • For each pixel, multiscale responses to the steerable Gaussian first-derivative filters and to the steerable second-derivative-of-Gaussian filters are computed. The gradient direction θ is computed using the steerable Gaussian first-derivative basis filters.
    • The filter responses are then tested for reliability using certain thresholds.
    • The scales for edge detection as defined in the paper are:
    σ1 = {64, 32, 16, 8, 4, 2, 1, 0.5} and σ2 = {32, 16, 8, 4, 2, 1, 0.5} pixels
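A sampled Gaussian first-derivative kernel at these dyadic scales might look like the following sketch (the 3σ kernel radius and all names are our choices, not the paper's):

```python
from math import exp, pi, sqrt

def gaussian_first_derivative(sigma):
    """Sampled first derivative of a 1-D Gaussian:
    g'(x) = -x / sigma^2 * g(x)."""
    radius = max(1, int(3 * sigma))
    norm = 1.0 / (sqrt(2.0 * pi) * sigma)
    return [-x / sigma ** 2 * norm * exp(-x * x / (2.0 * sigma ** 2))
            for x in range(-radius, radius + 1)]

# One kernel per scale in the sigma_1 set listed on the slide.
scales = [64, 32, 16, 8, 4, 2, 1, 0.5]
kernels = {s: gaussian_first_derivative(s) for s in scales}
```

Convolving the image with the x- and y-oriented versions of such kernels gives the two steerable basis responses from which the gradient direction θ is computed.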
  • The Solution
    Blurred Edge Detection
    Multi-scale edge detector working formulae:
    The Gaussian Derivative filters
    The first-order Gaussian derivative filter, with σ1 varying over the previously defined scales.
  • The Solution
    Blurred Edge Detection
    Multi-scale edge detector working formulae:
    The Gaussian Derivative filters
    The second-order Gaussian derivative filter, with σ2 varying over the previously defined scales.
  • The Solution
    Blurred Edge Detection
    Multi-scale edge detector working formulae:
    Reliability criterion:
    Reliability of the filter responses is tested against a threshold computed as follows (c1 and c2 for the first- and second-order Gaussian derivative filters, with scales σ1 and σ2 respectively):
  • The Solution
    Blur Estimation at edges
    • Fit response models of various sizes
    [Diagram: second-derivative response (d) of a less blurry and a more blurry edge, with the fitted response model]
  • The Solution
    Blur Estimation at edges
    Where the response model is given as (σb is the size of the blur kernel):
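One way to see why fitting the response model recovers σb: for a Gaussian-blurred step, the second derivative of the intensity profile peaks at −σb and dips at +σb, so half the distance between the two extrema estimates the blur size. A 1-D sketch of this idea (not the paper's exact fitting procedure; all names are ours):

```python
from math import erf, sqrt

def blurred_step(x, sigma):
    """Unit step convolved with a Gaussian of std sigma (Gaussian CDF)."""
    return 0.5 * (1.0 + erf(x / (sigma * sqrt(2.0))))

def estimate_blur(profile):
    """Estimate sigma_b as half the distance between the extrema of
    the discrete second derivative of an edge profile."""
    d2 = [profile[i - 1] - 2.0 * profile[i] + profile[i + 1]
          for i in range(1, len(profile) - 1)]
    return abs(d2.index(min(d2)) - d2.index(max(d2))) / 2.0

profile = [blurred_step(x, 3.0) for x in range(-20, 21)]
print(estimate_blur(profile))  # 3.0
```

On real images the paper fits the model to the measured responses instead, which is far more robust to noise and neighboring edges.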
  • The Solution
    Robust Blur Estimation
    Successfully measures the blur size in spite of the influence of nearby scene events.
    [Figure: input image and our blur measure, on a blurry-to-sharp scale]
  • The Solution
    The Blur Measure
    A sparse set (BM)
    • values only at edges
    • grey means no value
    [Figure: input image and its blur measure, on a blurry-to-sharp scale]
  • The Solution
    Refinement of Blur Estimation
    • Erroneous blur estimates occur due to soft shadows and glossy highlights.
    [Figure: input image and its blur measure, on a blurry-to-sharp scale]
  • The Solution
    Remove Outliers
    Using cross bilateral filtering [Eisemann 04, Petschnigg 04]:
    a weighted mean of neighboring blur measures.
    [Figure: blur measure before and after refinement, on a blurry-to-sharp scale]
  • The Solution
    Refine Blur Estimation
    The biased cross bilateral filtering of the sparse set of blur measures BM at an edge pixel p is formulated as follows:
    [formula not captured in the transcript]
    where b(BM) = exp(-BM/2),
    gσ(x) = exp(-x² / (2σ²)),
    σr = 10% of the intensity range, and
    σs = 10% of the image size.
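A 1-D sketch of plain cross (joint) bilateral filtering of a sparse blur measure, guided by the input image's intensities; the sharpness bias b(BM) from the slide is omitted for brevity, and all names and parameters are ours:

```python
from math import exp

def cross_bilateral_1d(blur, image, edge_mask, sigma_s, sigma_r):
    """Replace each blur value by a weighted mean of the blur values at
    edge pixels, weighted by spatial distance (sigma_s) and by the
    similarity of the guide image's intensities (sigma_r)."""
    out = []
    for p in range(len(blur)):
        num = den = 0.0
        for q in range(len(blur)):
            if not edge_mask[q]:
                continue  # the blur measure is defined only at edges
            w = (exp(-(p - q) ** 2 / (2.0 * sigma_s ** 2)) *
                 exp(-(image[p] - image[q]) ** 2 / (2.0 * sigma_r ** 2)))
            num += w * blur[q]
            den += w
        out.append(num / den if den > 0.0 else 0.0)
    return out

# Two regions of constant intensity: the estimate on the right is not
# contaminated by the very different blur values on the left.
image = [0, 0, 0, 10, 10, 10]
blur = [1.0, 0.0, 2.0, 0.0, 5.0, 0.0]
mask = [True, False, True, False, True, False]
smoothed = cross_bilateral_1d(blur, image, mask, sigma_s=1.0, sigma_r=0.1)
```

The range term is what makes the filter edge-aware: blur values do not leak across strong intensity edges of the guide image.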
  • The Solution
    Blur Propagation
    Given a sparse set of the blur measure (BM),
    propagate the blur measure to the entire image.
    Assumption: blurriness (B) is smooth except at image edges.
    Inspired by [Levin et al. 2004]
    [Figure: input image and its sparse blur measure]
  • The Solution
    Blur Propagation
    Given a sparse set of the blur measure (BM),
    propagate the blur measure to the entire image.
    Assumption: blurriness (B) is smooth except at image edges.
    We minimize the sum of a data term and a smoothness term, with the smoothness weight between pixels p and q proportional to e^(-||C(p) - C(q)||²), and αp = 0.5 for edge pixels.
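The minimization above is a sparse quadratic problem; in 1-D it can be solved with simple Gauss-Seidel sweeps. A sketch under our own naming (not the paper's solver), with `weights[p]` coupling pixels p and p+1, small across strong image edges so that propagation stops there:

```python
def propagate_blur_1d(measure, alpha, weights, iters=500):
    """Minimize  sum_p alpha[p] * (B[p] - measure[p])**2
               + sum_p weights[p] * (B[p] - B[p+1])**2
    by Gauss-Seidel: each pixel is repeatedly set to the weighted mean
    of its data value and its current neighbours."""
    B = list(measure)
    n = len(B)
    for _ in range(iters):
        for p in range(n):
            num = alpha[p] * measure[p]
            den = alpha[p]
            if p > 0:
                num += weights[p - 1] * B[p - 1]
                den += weights[p - 1]
            if p < n - 1:
                num += weights[p] * B[p + 1]
                den += weights[p]
            if den > 0.0:
                B[p] = num / den
    return B

# Blur known only at the two end pixels; uniform smoothness weights
# interpolate it across the unknown interior.
B = propagate_blur_1d([2.0, 0.0, 0.0, 0.0, 8.0],
                      alpha=[1.0, 0.0, 0.0, 0.0, 1.0],
                      weights=[1.0, 1.0, 1.0, 1.0])
print(B)  # close to [3.0, 4.0, 5.0, 6.0, 7.0]
```

With a near-zero weight somewhere in the chain, the two sides decouple, which is exactly the edge-preserving behaviour the next slide illustrates.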
  • The Solution
    Blur Propagation
    Edge-preserving propagation
    • propagation stops at input edges
    [Figure: input image, sparse blur measure, and the propagated defocus map]
  • The Solution
    Blur the blurry regions
    We use Photoshop Lens Blur to generate results with our defocus map instead of a depth map
  • Recap
    1. User provides a single input photograph
    2. Our system automatically produces the defocus map
    3. We use Photoshop’s lens blur to generate the defocus magnified result
    [Figure: input, our defocus map, and the result with increased defocus]
    Slide Credit: Bae & Durand
  • Results
  • Input
    Result
    Defocus Map
    Slide Credit: Bae & Durand
  • Input
    Result
    Slide Credit: Bae & Durand
  • Input
    Result
    Defocus Map
    Slide Credit: Bae & Durand
  • Input
    Result
    Slide Credit: Bae & Durand
  • Input
    Result
    Defocus Map
    Slide Credit: Bae & Durand
  • Input
    Result
    Slide Credit: Bae & Durand
  • Comparison with the ground truth
    ground truth (f/4)
    Input (f/8)
    our result
    Slide Credit: Bae & Durand
  • Discussion
  • Future work
    Occlusion boundary
    Video inputs (motion blur)
    Refocusing
    Discussion
    Summary
    Analyze existing defocus
    • multiscale edge detector & fitting
    • non-homogeneous propagation
    Magnify defocus
  • Thank you