ECE 285
Image and video restoration
Chapter II – Basics of filtering I
Charles Deledalle
March 11, 2019
1
Basics of filtering
Definition (Collins dictionary)
filter, noun: any electronic, optical, or acoustic device that blocks signals or
radiations of certain frequencies while allowing others to pass.
Refers to the direct model (observation/sensing filter)
y = Hx
• y: observed image
• x: image of interest
H is a linear filter; it may act only on frequencies (e.g., blurs) or not, but
it can only remove information (e.g., inpainting).
(a) Unknown image x  −H→  (b) Observation y
2
Basics of filtering
Definition (Oxford dictionary)
filter, noun: a function used to alter the overall appearance of an image in a
specific manner: ‘many other apps also offer filters for enhancing photos’
Refers to the inversion model (restoration filter)
ˆx = ψ(y)
• y: observed image
• ˆx: estimate of x
ψ is a filter, linear or non-linear, that may act only on frequencies or may not,
and usually attempts to add information.
(a) Observation y  −ψ→  (b) Estimate ˆx
3
Basics of filtering
Action of filters
Perform pointwise, local, and/or global transformations of pixel values:

Pointwise: the new pixel value depends on the corresponding input value only
(e.g., change of contrast)

Local: the new pixel value depends on the surrounding input pixels
(e.g., averaging/convolutions)

Global: the new pixel value depends on the whole input image
(e.g., the sigma filter)
4
Basics of filtering
Filters
• Often one of the first steps in a processing pipeline,
• Goal: improve, simplify, denoise, deblur, detect objects...
Figure: a computer vision pipeline. Source: Mike Thompson
5
Basics of filtering
Improve/denoise/detect
6
Basics of filtering
Improve/denoise/detect
Fibroblast cells and microbeads (fluorescence microscopy)
Source: F. Luisier & C. Vonesch
7
Basics of filtering
Improve/denoise/detect
Foreground/Background separation
Source: H. Jiang, et al.
8
Basics of filtering
Standard filters
Two main approaches:
• Spatial domain: use the pixel grid / spatial neighborhoods
• Spectral domain: use Fourier transform, cosine transform, . . .
Spatial Spectral
9
Spatial filtering
Spatial filtering – Local filters
Local / Neighboring filters
• Combine/select values of y in the neighborhood Ni,j of pixel (i, j)
• Following examples: moving average filters, derivative filters, median filters
10
Spatial filtering – Moving average
Moving average

$\hat{x}_{i,j} = \frac{1}{\mathrm{Card}(N)} \sum_{(k,l) \in N_{i,j}} y_{k,l}$ or $\hat{x}_{i,j} = \frac{1}{\mathrm{Card}(N)} \sum_{(k,l) \in N} y_{i+k,j+l}$, with $N = N_{0,0}$

Examples:
• Boxcar filter: $N_{i,j} = \{(k,l) \,;\, |i-k| \le \tau \text{ and } |j-l| \le \tau\}$
• Diskcar filter: $N_{i,j} = \{(k,l) \,;\, |i-k|^2 + |j-l|^2 \le \tau^2\}$

3 × 3 boxcar filter:

$\hat{x}_{i,j} = \frac{1}{9} \sum_{k=i-1}^{i+1} \sum_{l=j-1}^{j+1} y_{k,l}$ or $\hat{x}_{i,j} = \frac{1}{9} \sum_{k=-1}^{+1} \sum_{l=-1}^{+1} y_{i+k,j+l}$

Parameters:
• Size: 3 × 3, 5 × 5, . . .
• Shape: square, disk
• Centered or not
11
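As a concrete check, here is a minimal numpy sketch of the 3 × 3 boxcar filter above; the function name and the border handling (replication) are my choices, not from the slides.

```python
import numpy as np

def boxcar3(y):
    """3x3 boxcar: average each pixel with its 8 neighbors (borders replicated)."""
    yp = np.pad(y, 1, mode="edge")
    out = np.zeros(y.shape, dtype=float)
    for k in (-1, 0, 1):
        for l in (-1, 0, 1):
            # shifted copy of y corresponding to y_{i+k, j+l}
            out += yp[1 + k : 1 + k + y.shape[0], 1 + l : 1 + l + y.shape[1]]
    return out / 9.0

y = np.arange(25, dtype=float).reshape(5, 5)
print(boxcar3(y)[2, 2])   # 12.0: average of the 3x3 neighborhood of the center pixel
```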
Spatial filtering – Moving average
Moving weighted average

$\hat{x}_{i,j} = \frac{\sum_{(k,l) \in \mathbb{Z}^2} w_{k,l}\, y_{i+k,j+l}}{\sum_{(k,l) \in \mathbb{Z}^2} w_{k,l}}$

• Neighboring filter: $w_{i,j} = 1$ if $(i,j) \in N$, $0$ otherwise
• Gaussian kernel: $w_{i,j} = \exp\left(-\frac{i^2 + j^2}{2\tau^2}\right)$
• Exponential kernel: $w_{i,j} = \exp\left(-\frac{\sqrt{i^2 + j^2}}{\tau}\right)$
12
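The moving weighted average can be sketched the same way; here with the Gaussian kernel weights defined above. The helper names and the grid half-size `q` are assumptions for illustration.

```python
import numpy as np

def gaussian_weights(q, tau):
    # w_{k,l} = exp(-(k^2 + l^2) / (2 tau^2)) on a (2q+1) x (2q+1) grid
    k = np.arange(-q, q + 1)
    K, L = np.meshgrid(k, k, indexing="ij")
    return np.exp(-(K**2 + L**2) / (2.0 * tau**2))

def weighted_average(y, w):
    """Moving weighted average: sum of w_{k,l} y_{i+k,j+l}, normalized by sum of w."""
    q = w.shape[0] // 2
    yp = np.pad(y, q, mode="edge")
    out = np.zeros(y.shape, dtype=float)
    for a in range(w.shape[0]):
        for b in range(w.shape[1]):
            out += w[a, b] * yp[a : a + y.shape[0], b : b + y.shape[1]]
    return out / w.sum()

xhat = weighted_average(np.full((6, 6), 5.0), gaussian_weights(2, 1.5))
print(xhat[0, 0])   # 5.0: normalized weights preserve constant images
```

Because the weights are normalized, a constant image comes out unchanged, which is the "preserve mean" property discussed later.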
Spatial filtering – Moving average
• Rewrite $\hat{x}$ as a function of $s = (i,j)$, and let $\delta = (k,l)$ and $t = s + \delta$:

$\hat{x}(s) = \frac{\sum_{\delta \in \mathbb{Z}^2} w(\delta)\, y(s+\delta)}{\sum_{\delta \in \mathbb{Z}^2} w(\delta)} = \frac{\sum_{t \in \mathbb{Z}^2} w(t-s)\, y(t)}{\sum_{t \in \mathbb{Z}^2} w(t-s)}$

Local average filter
• Weights are functions of the distance between $t$ and $s$ (the length of $\delta$): $w(t-s) = \varphi(\mathrm{length}(t-s))$
• $\varphi : \mathbb{R}^+ \to \mathbb{R}$: kernel function (≠ convolution kernel)
• Often, $\varphi$ satisfies:
  • $\varphi(0) = 1$,
  • $\lim_{\alpha \to \infty} \varphi(\alpha) = 0$,
  • $\varphi$ non-increasing: $\alpha > \beta \Rightarrow \varphi(\alpha) \le \varphi(\beta)$.
13
Spatial filtering – Moving average
Example
• Box filter: $\varphi(\alpha) = 1$ if $\alpha \le \tau$, $0$ otherwise, with $\mathrm{length}(\delta) = \|\delta\|_\infty$
• Disk filter: $\varphi(\alpha) = 1$ if $\alpha \le \tau$, $0$ otherwise, with $\mathrm{length}(\delta) = \|\delta\|_2$
• Gaussian filter: $\varphi(\alpha) = \exp\left(-\frac{\alpha^2}{2\tau^2}\right)$ with $\mathrm{length}(\delta) = \|\delta\|_2$
• Exponential filter: $\varphi(\alpha) = \exp\left(-\frac{\alpha}{\tau}\right)$ with $\mathrm{length}(\delta) = \|\delta\|_2$

Reminder: $\|v\|_p = \left(\sum_{k=1}^{d} v_k^p\right)^{1/p}$
14
Spatial filtering – Moving average
• ϕ often depends on (at least) one parameter τ
• τ controls the amount of filtering
• τ → 0: no filtering (output = input)
• τ → ∞: average everything in the same proportion
(output = constant signal)
What would $\varphi(\alpha) = 1$ if $\alpha \le \tau$ ($0$ otherwise) with $\mathrm{length}(\delta) = \|\delta\|_1$ provide?

$d$: dimension ($d = 2$ for pictures, $d = 3$ for videos, . . . )
15
Spatial filtering – Moving average
Figure: box filter (a–d), diamond filter (e–h), and disk filter (i–l), each for τ = 1, 20, 40, 10³.
16
Spatial filtering
How to express anisotropy?
• $\|e_1\|_2 = 1$
• $\|e_2\|_2 = 1$
• $\langle e_1, e_2 \rangle = 0$

$\mathrm{length}(\delta) = \sqrt{\delta^T \Sigma^{-1} \delta}$ where $\Sigma = \begin{pmatrix} e_1 & e_2 \end{pmatrix} \begin{pmatrix} \lambda_1^2 & 0 \\ 0 & \lambda_2^2 \end{pmatrix} \begin{pmatrix} e_1^T \\ e_2^T \end{pmatrix}$ (eigen-decomposition)

$\phantom{\mathrm{length}(\delta)} = \|SR\delta\|_2$ where $S = \begin{pmatrix} \lambda_1^{-1} & 0 \\ 0 & \lambda_2^{-1} \end{pmatrix}$ (scaling) and $R = \begin{pmatrix} e_1^T \\ e_2^T \end{pmatrix}$ (rotation)

Indeed, $e_1 = \begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix}$, $e_2 = \begin{pmatrix} -\sin\theta \\ \cos\theta \end{pmatrix}$, i.e. $R = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}$, a rotation of $-\theta$.
17
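The anisotropic length can be checked numerically. A small sketch, assuming (as on the slide) that the rows of $R$ are $e_1^T$ and $e_2^T$; the function name is mine.

```python
import numpy as np

def aniso_length(delta, theta, lam1, lam2):
    """length(delta) = ||S R delta||_2: rotate by -theta, then scale axis i by 1/lam_i."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])            # rows are e1^T and e2^T
    S = np.diag([1.0 / lam1, 1.0 / lam2])
    return float(np.linalg.norm(S @ R @ np.asarray(delta, dtype=float)))

# lam1 = lam2 = 1 recovers the isotropic Euclidean length
print(aniso_length([3.0, 4.0], 0.7, 1.0, 1.0))   # 5.0
```

With unequal λ's the unit ball becomes an ellipse aligned with e₁, e₂, which is exactly what an anisotropic (oriented) averaging filter uses.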
Spatial filtering – Moving average
(a) y (b) ˆx, θ = 0° (c) θ = 26° (d) θ = 51°
(e) θ = 77° (f) θ = 103° (g) θ = 129° (h) θ = 154°
18
Spatial filtering – Moving average for denoising
Moving average for denoising?
Figure 1 – (left) Gaussian noise σ = 10. (right) Gaussian filter τ = 3.
19
Spatial filtering – Moving average for denoising
Moving average for denoising?
Figure 1 – (left) Gaussian noise σ = 30. (right) Gaussian filter τ = 5.
19
Spatial filtering – Moving average for denoising
Input image Boxcar filter Gaussian filter
• Boxcar: oscillations/artifacts in the vertical and horizontal directions
• Gaussian: no artifacts
• Moving average: reduces noise, but at the cost of a loss of resolution, a blurry aspect, and removed edges
20
Spatial filtering – Moving average for denoising
Image blur ⇒ No more edges ⇒ Structure destruction
⇒ Reduction of image quality
What is an edge?
21
Spatial filtering – Edges
Edges?
• Separation between objects, important parts of the image
• Necessary for vision in order to reconstruct objects
22
Spatial filtering – Edges
Edge: a more or less abrupt change of intensity
23
Spatial filtering – Edges
• no edges ≡ no objects in the image
• abrupt change ⇒ gap between intensities ⇒ large derivative
24
Spatial filtering – Derivative filters
How to detect edges?
• Look at the derivative
• How? Use derivative filters
• What? Filters that behave like the derivative of real-valued functions
How to design such filters?
25
Spatial filtering – Derivative filters
Derivative of 1d signals
• The derivative of a function $x : \mathbb{R} \to \mathbb{R}$, if it exists, is

$x'(t) = \lim_{h \to 0} \frac{x(t+h) - x(t)}{h}$ or $\lim_{h \to 0} \frac{x(t) - x(t-h)}{h}$ or $\lim_{h \to 0} \frac{x(t+h) - x(t-h)}{2h}$ (equivalent definitions)

• For a 1d discrete signal, the finite differences are

$(\nabla x)_k = x_{k+1} - x_k$ (forward), $(\nabla x)_k = x_k - x_{k-1}$ (backward), $(\nabla x)_k = \frac{x_{k+1} - x_{k-1}}{2}$ (centered)
26
Spatial filtering – Derivative filters
Derivative of 1d signals
• Can be written as a filter:

$(\nabla x)_i = \sum_{k=-1}^{+1} \kappa_k\, y_{i+k}$, with $\kappa = (0, -1, 1)$ (forward), $\kappa = (-1, 1, 0)$ (backward), or $\kappa = (-\frac{1}{2}, 0, \frac{1}{2})$ (centered)
27
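The three schemes can be tried on a sampled parabola; the array values below are a made-up example (samples of $t^2$).

```python
import numpy as np

x = np.array([0.0, 1.0, 4.0, 9.0, 16.0])   # samples of t^2 at t = 0..4
fwd = x[1:] - x[:-1]                        # kappa = (0, -1, 1)
ctr = (x[2:] - x[:-2]) / 2.0                # kappa = (-1/2, 0, 1/2)
print(ctr)   # [2. 4. 6.]: the centered scheme is exact for quadratics (derivative 2t)
```

The forward difference gives [1, 3, 5, 7], a half-sample-shifted estimate; the centered scheme is second-order accurate, which is why it recovers $2t$ exactly here.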
Spatial filtering – Derivative filters
Derivative of 2d signals
• The gradient of a function $x : \mathbb{R}^2 \to \mathbb{R}$, if it exists, is

$\nabla x = \begin{pmatrix} \frac{\partial x}{\partial s_1} \\ \frac{\partial x}{\partial s_2} \end{pmatrix}$ with $\frac{\partial x}{\partial s_1}(s_1, s_2) = \lim_{h \to 0} \frac{x(s_1 + h, s_2) - x(s_1, s_2)}{h}$ and $\frac{\partial x}{\partial s_2}(s_1, s_2) = \lim_{h \to 0} \frac{x(s_1, s_2 + h) - x(s_1, s_2)}{h}$
28
Spatial filtering – Derivative filters
Derivative of 2d signals
• Gradient for a 2d discrete signal: finite differences in each direction

$(\nabla_1 x)_{i,j} = \sum_{k=-1}^{+1} \sum_{l=-1}^{+1} (\kappa_1)_{k,l}\, y_{i+k,j+l}$ and $(\nabla_2 x)_{i,j} = \sum_{k=-1}^{+1} \sum_{l=-1}^{+1} (\kappa_2)_{k,l}\, y_{i+k,j+l}$

Forward: $\kappa_1 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 1 & 0 \end{pmatrix}$, $\kappa_2 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & -1 & 1 \\ 0 & 0 & 0 \end{pmatrix}$

Backward: $\kappa_1 = \begin{pmatrix} 0 & -1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}$, $\kappa_2 = \begin{pmatrix} 0 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}$

Centered: $\kappa_1 = \begin{pmatrix} 0 & -\frac{1}{2} & 0 \\ 0 & 0 & 0 \\ 0 & \frac{1}{2} & 0 \end{pmatrix}$, $\kappa_2 = \begin{pmatrix} 0 & 0 & 0 \\ -\frac{1}{2} & 0 & \frac{1}{2} \\ 0 & 0 & 0 \end{pmatrix}$
29
Spatial filtering – Derivative filters
Second order derivative of 1d signals
• The second order derivative of a function $x : \mathbb{R} \to \mathbb{R}$, if it exists, is

$x''(t) = \lim_{h \to 0} \frac{x(t-h) - 2x(t) + x(t+h)}{h^2}$

• For a 1d discrete signal: $(\Delta x)_k = x_{k-1} - 2x_k + x_{k+1}$
• Corresponding filter: $h = (1, -2, 1)$

Laplacian of 2d signals
• The Laplacian of a function $x : \mathbb{R}^2 \to \mathbb{R}$, if it exists, is

$\Delta x = \frac{\partial^2 x}{\partial s_1^2} + \frac{\partial^2 x}{\partial s_2^2}$

• For a 2d discrete signal: $(\Delta x)_{i,j} = x_{i-1,j} + x_{i,j-1} - 4x_{i,j} + x_{i+1,j} + x_{i,j+1}$
• Corresponding filter: $h = \begin{pmatrix} 0 & 1 & 0 \\ 1 & -4 & 1 \\ 0 & 1 & 0 \end{pmatrix}$, i.e. the vertical kernel $\begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}$ plus the horizontal kernel $\begin{pmatrix} 1 & -2 & 1 \end{pmatrix}$, each embedded in the 3 × 3 grid.
30
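A quick numerical sanity check of the five-point Laplacian: it vanishes exactly on affine images. Interior-only evaluation is my simplification to avoid boundary choices.

```python
import numpy as np

def laplacian(y):
    """Five-point Laplacian y_{i-1,j} + y_{i,j-1} - 4 y_{i,j} + y_{i+1,j} + y_{i,j+1},
    evaluated on interior pixels only."""
    return (y[:-2, 1:-1] + y[1:-1, :-2] - 4.0 * y[1:-1, 1:-1]
            + y[2:, 1:-1] + y[1:-1, 2:])

i, j = np.meshgrid(np.arange(6.0), np.arange(6.0), indexing="ij")
ramp = 2.0 * i + 3.0 * j          # affine image: second derivatives vanish
print(np.abs(laplacian(ramp)).max())   # 0.0
```

On $i^2$ it returns 2 everywhere, the exact second derivative, confirming the stencil implements a discrete $\Delta$.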
Spatial filtering – Derivative filters
Figure: (a) $x$, (b) $\nabla_1 x$, (c) $\nabla_2 x$, (d) $\Delta x$ on a clean image; (e)–(h) the same on a noisy image.
Derivative filters detect edges but are sensitive to noise
31
Spatial filtering – Derivative filters
Other derivative filters
• Roberts cross operator (1963): $\kappa = \begin{pmatrix} +1 & 0 \\ 0 & -1 \end{pmatrix}$ and $\kappa = \begin{pmatrix} 0 & +1 \\ -1 & 0 \end{pmatrix}$

• Sobel operator (1968): $\kappa_1 = \begin{pmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 2 & 1 \end{pmatrix}$ and $\kappa_2 = \begin{pmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} \begin{pmatrix} -1 & 0 & 1 \end{pmatrix}$

• Prewitt operator (1970): $\kappa_1 = \begin{pmatrix} -1 & -1 & -1 \\ 0 & 0 & 0 \\ 1 & 1 & 1 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 1 & 1 \end{pmatrix}$ and $\kappa_2 = \begin{pmatrix} -1 & 0 & 1 \\ -1 & 0 & 1 \\ -1 & 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \begin{pmatrix} -1 & 0 & 1 \end{pmatrix}$
32
Spatial filtering – Derivative filters
Edge detection
Based on the norm (and angle) of the discrete approximation of the gradient:

$\|\nabla x\|_k = \sqrt{(\nabla_1 x)_k^2 + (\nabla_2 x)_k^2}$ and $(\angle \nabla x)_k = \mathrm{atan2}((\nabla_2 x)_k, (\nabla_1 x)_k)$
33
Spatial filtering – Derivative filters
Figure: (a) $x$, (b) $\nabla_1 x$, (c) $\nabla_2 x$, (d) $\|\nabla x\|$; (e) $x$, (f) $\nabla_1 x$ (Prewitt), (g) $\nabla_2 x$ (Prewitt), (h) $\|\nabla x\|$ (Sobel)
Sobel & Prewitt: average in one direction, and differentiate in the other one
⇒ More robust to noise
34
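A sketch of Sobel-based edge detection on a synthetic step edge, using the cross-correlation form $\hat{x}_{i,j} = \sum \kappa_{k,l}\, y_{i+k,j+l}$ from the slides; border replication is an assumption.

```python
import numpy as np

K1 = np.array([[-1., -2., -1.], [0., 0., 0.], [1., 2., 1.]])   # Sobel kappa_1
K2 = K1.T                                                      # Sobel kappa_2

def correlate3(y, k):
    """Cross-correlation: sum over a,b of k[a,b] * y_{i+a-1, j+b-1}, borders replicated."""
    yp = np.pad(y, 1, mode="edge")
    out = np.zeros(y.shape)
    for a in range(3):
        for b in range(3):
            out += k[a, b] * yp[a : a + y.shape[0], b : b + y.shape[1]]
    return out

y = np.zeros((6, 6)); y[:, 3:] = 1.0                 # vertical step edge
g = np.hypot(correlate3(y, K1), correlate3(y, K2))    # gradient magnitude
print(g[2, 2], g[2, 0])   # 4.0 0.0: strong response on the edge, none on flat areas
```

The column sums of each Sobel kernel are zero, so flat regions give exactly zero response, while the averaging along the edge direction is what makes Sobel more robust to noise than plain differences.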
Spatial filtering – Averaging and derivative filters
Comparison between averaging and derivative filters
• Moving average:

$\hat{x}_{i,j} = \frac{\sum_{(k,l) \in \mathbb{Z}^2} w_{k,l}\, y_{i+k,j+l}}{\sum_{(k,l) \in \mathbb{Z}^2} w_{k,l}} = \sum_{(k,l) \in \mathbb{Z}^2} \underbrace{\frac{w_{k,l}}{\sum_{(p,q) \in \mathbb{Z}^2} w_{p,q}}}_{\kappa_{k,l}}\, y_{i+k,j+l} = \sum_{(k,l) \in \mathbb{Z}^2} \kappa_{k,l}\, y_{i+k,j+l}$ with $\sum_{(k,l) \in \mathbb{Z}^2} \kappa_{k,l} = 1$ (preserves the mean)

• Derivative filter:

$\hat{x}_{i,j} = \sum_{(k,l) \in \mathbb{Z}^2} \kappa_{k,l}\, y_{i+k,j+l}$ with $\sum_{(k,l) \in \mathbb{Z}^2} \kappa_{k,l} = 0$ (removes the mean)

• They share the same expression.
Do all filters have such an expression? 35
Spatial filtering – Linear translation-invariant filters
No, only linear translation-invariant (LTI) filters
Let $\psi$ satisfy:
1. Linearity: $\psi(ax + by) = a\psi(x) + b\psi(y)$
2. Translation-invariance: $\psi(y^\tau) = \psi(y)^\tau$, where $x^\tau(s) = x(s + \tau)$

Then there exist coefficients $\kappa_{k,l}$ such that

$\psi(y)_{i,j} = \sum_{(k,l) \in \mathbb{Z}^2} \kappa_{k,l}\, y_{i+k,j+l}$

The reciprocal holds true.

Note: translation-invariant = shift-invariant = stationary
= the same weighting applied everywhere
= identical behavior on identical structures, whatever their location
36
Spatial filtering – Linear translation-invariant filters
Linear translation-invariant filters

$\hat{x}_{i,j} = \psi(y)_{i,j} = \sum_{(k,l) \in \mathbb{Z}^2} \kappa_{k,l}\, y_{i+k,j+l}$

• Weighted average filters: $\sum_{(k,l)} \kappa_{k,l} = 1$. Ex.: box, Gaussian, exponential, . . .
• Derivative filters: $\sum_{(k,l)} \kappa_{k,l} = 0$. Ex.: Laplacian, Sobel, Roberts, . . .
37
Spatial filtering – Linear translation-invariant filters
LTI filter ≡ Moving weighted sum ≡ Cross-correlation ≡ Convolution
$\hat{x}_{i,j} = \sum_{(k,l) \in \mathbb{Z}^2} \kappa^*_{k,l}\, y_{i+k,j+l} = \kappa \star y$ (for $\kappa$ complex) $= \sum_{(k,l) \in \mathbb{Z}^2} \nu_{k,l}\, y_{i-k,j-l} = \nu * y$ where $\nu_{k,l} = \kappa^*_{-k,-l}$

$\nu$ is called the convolution kernel (the impulse response of the filter).
38
Spatial filtering – LTI filters and convolution
Properties of the convolution product
• Linear: $f * (\alpha g + \beta h) = \alpha(f * g) + \beta(f * h)$
• Commutative: $f * g = g * f$
• Associative: $f * (g * h) = (f * g) * h$
• Separable: if $h = h_1 * h_2 * \ldots * h_p$, then $f * h = (((f * h_1) * h_2) \ldots * h_p)$
39
Spatial filtering – LTI filters and convolution
• Directional separability of (isotropic) Gaussians: $G^{2d}_\tau = G^{1d,\,\mathrm{horizontal}}_\tau * G^{1d,\,\mathrm{vertical}}_\tau$
40
Spatial filtering – LTI filters and convolution
Directional separability of Gaussians.

$(y * G^{2d}_\tau)_{i,j} = \frac{1}{Z} \sum_{k=-\infty}^{\infty} \sum_{l=-\infty}^{\infty} \exp\left(-\frac{k^2 + l^2}{2\tau^2}\right) y_{i-k,j-l}$

$\approx \frac{1}{Z} \sum_{k=-q}^{q} \sum_{l=-q}^{q} \exp\left(-\frac{k^2 + l^2}{2\tau^2}\right) y_{i-k,j-l}$ (restriction to an $s \times s$ window, $s = 2q+1$; complexity $O(s^2 n_1 n_2)$)

$\approx \frac{1}{Z} \sum_{k=-q}^{q} \exp\left(-\frac{k^2}{2\tau^2}\right) \underbrace{\sum_{l=-q}^{q} \exp\left(-\frac{l^2}{2\tau^2}\right) y_{i-k,j-l}}_{\propto\, (y \,*\, G^{1d,\,\mathrm{horizontal}})_{i-k,j}}$, i.e. the whole sum is $\propto (y * G^{1d,\,\mathrm{horizontal}}) * G^{1d,\,\mathrm{vertical}}$ (complexity $O(s\, n_1 n_2)$)
41
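The separability argument can be verified numerically: a 2d Gaussian correlation with periodic boundaries equals a horizontal 1d pass followed by a vertical one. Sizes, τ, and the random seed below are arbitrary choices.

```python
import numpy as np

q, tau = 3, 1.0
g1 = np.exp(-np.arange(-q, q + 1) ** 2 / (2.0 * tau**2))   # 1d Gaussian taps
G2 = np.outer(g1, g1)                                      # 2d kernel is separable

rng = np.random.default_rng(0)
y = rng.random((8, 8))
yp = np.pad(y, q, mode="wrap")                             # periodic boundaries

# direct 2d correlation: O(s^2) taps per pixel
full = np.zeros_like(y)
for a in range(2 * q + 1):
    for b in range(2 * q + 1):
        full += G2[a, b] * yp[a : a + 8, b : b + 8]

# two 1d passes: O(s) taps per pixel per pass
rows = np.zeros_like(y)   # horizontal pass
for b in range(2 * q + 1):
    rows += g1[b] * np.pad(y, ((0, 0), (q, q)), mode="wrap")[:, b : b + 8]
sep = np.zeros_like(y)    # vertical pass
for a in range(2 * q + 1):
    sep += g1[a] * np.pad(rows, ((q, q), (0, 0)), mode="wrap")[a : a + 8, :]

print(np.allclose(sep, full))   # True
```

Both paths compute the same double sum of products $g_1[a]\, g_1[b]\, y$, just grouped differently, which is where the $O(s^2) \to O(s)$ saving comes from.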
Spatial filtering – LTI filters and convolution
• Multi-scale separability of Gaussians (continuous case): $G_{\tau_1^2} * G_{\tau_2^2} = G_{\tau_1^2 + \tau_2^2}$
42
Spatial filtering – LTI filters and convolution
• Separability of Derivatives of Gaussian (DoG) (continuous case): $\frac{\partial}{\partial s}(G_\tau * f) = \frac{\partial G_\tau}{\partial s} * f = G_\tau * \frac{\partial f}{\partial s}$
43
Spatial filtering – LTI filters and convolution
Separability of other LTI filters

Filter              Directional sep.   Multi-scale sep.
Gaussian filter     √ (↓ ∗ →)          √
Exponential filter  x                  x
Box filter          √ (↓ ∗ →)          x
Disk filter         x                  x
Diamond filter      x                  x
Laplacian           √ (↓ + →)          -
Sobel               √ (↓ ∗ →)          -
Prewitt             √ (↓ ∗ →)          -
44
Spatial filtering – LTI and linear algebra
LTI filters can be written as a matrix vector product
Functional representation:

$\hat{x}(s) = \sum_{\delta \in \mathbb{Z}^2} \kappa^*(\delta)\, y(s + \delta) = \sum_{\delta \in \mathbb{Z}^2} \nu(\delta)\, y(s - \delta)$

Vector representation:

$\hat{x} = Hy$ with $h_{i,j} = \kappa^*(s_j - s_i) = \nu(s_i - s_j)$

• Vectors represent objects (here: images)
• Matrices represent linear processing (here: convolution)
45
Spatial filtering – LTI and linear algebra
Proof in the periodic case.
• Let $\delta = s_j - s_i$. Assuming periodic boundary conditions, we get

$\hat{x}(s_i) = \sum_{j=0}^{n-1} \kappa^*(s_j - s_i)\, y(s_i + (s_j - s_i)) = \sum_{j=0}^{n-1} \kappa^*(s_j - s_i)\, y(s_j)$

• Let $h_{i,j} = \kappa^*(s_j - s_i)$, $\hat{x}_i = \hat{x}(s_i)$ and $y_j = y(s_j)$:

$\hat{x}_i = \sum_{j=0}^{n-1} h_{i,j}\, y_j$

• Define the matrix $H = (h_{i,j})$; then $\hat{x} = Hy$.
46
Spatial filtering – LTI and linear algebra
What does H look like?
1d periodic case
• In 1d, LTI stands for linear time-invariant filters and reads

$\hat{x}(t_i) = \sum_{j=0}^{n-1} \nu(t_i - t_j)\, y(t_j)$

• Consider $t_i - t_j = i - j$, and let $h_{i,j} = \nu(t_i - t_j) = \nu_{i-j\,[n]}$.
• $H$ is a circulant matrix given by

$H = \begin{pmatrix} \nu_0 & \nu_{n-1} & \nu_{n-2} & \cdots & \nu_2 & \nu_1 \\ \nu_1 & \nu_0 & \nu_{n-1} & \nu_{n-2} & \cdots & \nu_2 \\ \vdots & & \ddots & & & \vdots \\ \nu_{n-1} & \nu_{n-2} & \cdots & \nu_2 & \nu_1 & \nu_0 \end{pmatrix}$
47
Spatial filtering – LTI and linear algebra
What does H look like?
2d periodic case
• In 2d, $H$ is a doubly block circulant matrix. Stacking the image rows into a single vector $x = (x_{0,0}, \ldots, x_{0,n_2-1}, x_{1,0}, \ldots, x_{n_1-1,n_2-1})^T$, $H$ is circulant at the block level:

$H = \begin{pmatrix} C_0 & C_{-1} & \cdots & C_1 \\ C_1 & C_0 & \cdots & C_2 \\ \vdots & & \ddots & \vdots \\ C_{-1} & C_{-2} & \cdots & C_0 \end{pmatrix}$

where each block $C_k$ is itself an $n_2 \times n_2$ circulant matrix built from the kernel row $(\nu_{k,l})_l$, e.g. the first row of $C_0$ is $(\nu_{0,0}, \nu_{0,-1}, \ldots, \nu_{0,1})$.
48
Spatial filtering – Properties of circulant matrices
Properties of circulant matrices
• Recall that the convolution is commutative: $f * g = g * f$
  ⇒ the same holds for (doubly block) circulant matrices: $H_1 H_2 = H_2 H_1$
• Diagonalizable matrices that commute share the same eigenvectors
  ⇒ all circulant matrices share the same eigenvectors
  ⇒ all LTI filters act in the same eigenspace
What eigenspace is that?
49
Spatial filtering – Properties of circulant matrices
Theorem
• The $n$ unit-norm eigenvectors of any circulant matrix $H$ read

$e_k = \frac{1}{\sqrt{n}} \left(1,\ \exp\left(\frac{2\pi i k}{n}\right),\ \exp\left(\frac{4\pi i k}{n}\right),\ \ldots,\ \exp\left(\frac{2(n-1)\pi i k}{n}\right)\right)$

for $k = 0$ to $n-1$. (Note: here $i$ is the imaginary unit.)
• Recall that unit-norm eigenvectors $(e_k)$ must satisfy:

$H e_k = \lambda_k e_k$, $e_k^* e_l = 0$ if $k \ne l$, and $\|e_k\|_2 = 1$

Challenge: try to prove it yourself.

The $(e_k)$ form a basis in which LTI filters modulate each element pointwise. This will be at the heart of Chapter 3.
50
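The theorem can be checked numerically on a small circulant matrix: the Fourier vectors are its eigenvectors, with eigenvalues given by the DFT of ν. The kernel ν below is a made-up smoothing example.

```python
import numpy as np

n = 8
nu = np.array([0.5, 0.25, 0., 0., 0., 0., 0., 0.25])   # hypothetical averaging kernel
# circulant matrix: h_{i,j} = nu_{(i-j) mod n}
H = np.array([[nu[(i - j) % n] for j in range(n)] for i in range(n)])

# Fourier vectors e_k (columns of F); eigenvalues are the DFT of nu
t = np.arange(n)
F = np.exp(2j * np.pi * np.outer(t, t) / n) / np.sqrt(n)
lam = np.fft.fft(nu)

print(np.allclose(H @ F, F * lam))   # True: every column of F is an eigenvector of H
```

This is precisely why LTI filtering diagonalizes in the Fourier basis, the starting point of Chapter 3.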
Spatial filtering – LTI filters – Limitations
Limitations of LTI filters
• Derivative filters: detect edges, but are sensitive to noise
• Moving averages: decrease noise, but do not preserve edges

Difficult object/background separation: LTI filters cannot achieve a good trade-off between noise reduction and edge preservation.
51
Spatial filtering – LTI filters – Limitations
Weak robustness against outliers
Figure 2 – (left) Impulse noise. (center) Gaussian filter τ = 5. (right) τ = 11.
• Even less efficient for impulse noise
• For the best trade-off: structures are lost, noise remains
• Do not adapt to the signal.
Can we achieve better performance by designing an adaptive filter?
52
Adaptive filtering
Spatial filtering – Adaptive filtering
Linear filter ⇒ Non-adaptive filter
• Linear filters are non-adaptive
• The operation does not depend on the signal
Simple, fast implementation
Introduce blur, do not preserve edges
Adaptive filter ⇒ Non-linear filter
• Adapt the filtering to the content of the image
• Operations/decisions depend on the values of y
• Adaptive ⇒ non-linear: in general $\psi(\alpha x + \beta y) \ne \alpha\psi(x) + \beta\psi(y)$,
since adapting to $x$ or to $y$ is not the same as adapting to $\alpha x + \beta y$.
53
Spatial filtering – Median filter
Median filters
• Try to denoise while respecting main structures
$\hat{x}_{i,j} = \mathrm{median}(y_{i+k,j+l} \mid (k,l) \in N)$, with $N$ a neighborhood
54
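A minimal 3 × 3 median filter sketch illustrating the removal of isolated points; the name and border replication are my choices.

```python
import numpy as np

def median3(y):
    """3x3 median filter with replicated borders."""
    yp = np.pad(y, 1, mode="edge")
    shifts = [yp[a : a + y.shape[0], b : b + y.shape[1]]
              for a in range(3) for b in range(3)]
    return np.median(np.stack(shifts), axis=0)

y = np.zeros((5, 5)); y[2, 2] = 255.0     # one impulse ("salt") pixel
print(median3(y).max())   # 0.0: the isolated point is removed, the flat area untouched
```

On a vertical step image the same filter returns the input unchanged, which is the edge-preservation property stated above.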
Spatial filtering – Median filter
Behavior of median filters
• Remove isolated points and thin structures
• Preserve (staircase) edges and smooth corners
55
Spatial filtering – Median filter
Figure 3 – (left) Impulse noise. (center) 3 × 3 median filter. (right) 9 × 9.
56
Spatial filtering – Median vs Gaussian
Figure 4 – (left) Impulse noise. (center) 9 × 9 median filter. (right) Gaussian τ = 4.
57
Spatial filtering – Median vs Gaussian
Figure 5 – (left) Gaussian noise. (center) 5 × 5 median filter. (right) Gaussian τ = 3.
58
Spatial filtering – Other standard non-linear filters
Morphological operators
• Erosion: $\hat{x}_{i,j} = \min(y_{i+k,j+l} \mid (k,l) \in N)$
• Dilation: $\hat{x}_{i,j} = \max(y_{i+k,j+l} \mid (k,l) \in N)$
• $N$ is called the structuring element

Figure 6 – (left) Salt-and-pepper noise, (center) Erosion, (right) Dilation
59
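Erosion and dilation over a 3 × 3 structuring element, composed into opening and closing; a pure-numpy sketch with replicated borders (my assumption).

```python
import numpy as np

def _shifts(y):
    yp = np.pad(y, 1, mode="edge")
    return np.stack([yp[a : a + y.shape[0], b : b + y.shape[1]]
                     for a in range(3) for b in range(3)])

def erode(y):    # min over the 3x3 structuring element
    return np.min(_shifts(y), axis=0)

def dilate(y):   # max over the 3x3 structuring element
    return np.max(_shifts(y), axis=0)

y = np.zeros((5, 5)); y[2, 2] = 1.0
opened = dilate(erode(y))    # opening: erosion then dilation
closed = erode(dilate(y))    # closing: dilation then erosion
print(opened.max(), closed[2, 2])   # 0.0 1.0: opening removes the bright point, closing keeps it
```

This matches the classic behavior: opening suppresses small bright structures ("salt"), closing suppresses small dark ones ("pepper").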
Spatial filtering – Morphological operators
Figure 7 – (top) Closing, (bottom) Opening. (Source: J.Y. Gil & R. Kimmel)
• Closing: dilation followed by erosion
• Opening: erosion followed by dilation
60
Spatial filtering – Global filtering
Local filter
• The operation depends only on the local neighborhood
• ex: Gaussian filter, median filter
Simple, fast implementation
Do not preserve textures (global context)
Global filter
• Adapt the filtering to the global content of the image
• Result at each pixel may depend on all other pixel values
• Idea: Use non-linearity and global information
61
Spatial filtering – Global average filters
Local average filter
$\hat{x}_i = \frac{\sum_{j=1}^{n} w_{i,j}\, y_j}{\sum_{j=1}^{n} w_{i,j}}$ with $w_{i,j} = \varphi(\|s_i - s_j\|_2^2)$

The weights depend on the distance between pixel positions (linear).
62
Spatial filtering – Global average filters
Sigma filter [Lee, 1981] / Yaroslavsky filter [Yaroslavsky, 1985]
$\hat{x}_i = \frac{\sum_{j=1}^{n} w_{i,j}\, y_j}{\sum_{j=1}^{n} w_{i,j}}$ with $w_{i,j} = \varphi(\|y_i - y_j\|_2^2)$

The weights depend on the distance between pixel values (non-linear).

Sigma filter: $\varphi(\alpha) = 1$ if $\alpha \le \tau^2$, $0$ otherwise.
63
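A direct $O(n^2)$ sketch of the sigma filter in its global form. For scalar values, thresholding $|y_i - y_j| \le \tau$ matches $\varphi(\alpha) = 1$ if $\alpha \le \tau^2$ applied to squared differences; the tiny test image is made up.

```python
import numpy as np

def sigma_filter(y, tau):
    """For each pixel, average over every pixel whose value differs by at most tau
    (global version: candidates come from the whole image)."""
    flat = y.ravel()
    out = np.empty_like(flat)
    for i, v in enumerate(flat):
        sel = np.abs(flat - v) <= tau
        out[i] = flat[sel].mean()
    return out.reshape(y.shape)

y = np.array([[10., 11., 200.],
              [ 9., 10., 201.],
              [10., 12., 199.]])
xhat = sigma_filter(y, 5.0)
print(xhat[0, 2])   # 200.0: the bright region is averaged only with itself (edge kept)
```

The dark and bright populations never mix, so the edge survives, at the cost of the histogram-remapping ("dull") effect described next.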
Spatial filtering – Sigma filter
Figure 8 – Selection of pixel candidates in the sigma filter
64
Spatial filtering – Sigma filter
(a) Noisy image σ = 10 (b) Sigma filter τ = 50 (c) τ = 100
(d) τ = 150 (e) τ = 200 (f) τ = 250
65
Spatial filtering – Sigma filter
Limitations of the Sigma filter
• Respects edges
• Produces a loss of contrast: a dull effect
• Does not reduce noise as much

Equivalent to a change of histogram:
• each value is mapped to another one
• the mapping depends on the image (adaptive/non-linear filtering)

Naive implementation: $O(n^2)$; back to $O(n)$ by using histograms.

Idea: apply the sigma filter on moving windows ≡ mix the moving average with the sigma filter.
66
Spatial filtering – Bilateral filter
Bilateral filter [Tomasi & Manduchi, 1998]
$\hat{x}_i = \frac{\sum_{j=1}^{n} w_{i,j}\, y_j}{\sum_{j=1}^{n} w_{i,j}}$ with $w_{i,j} = \varphi_{\mathrm{space}}(\|s_i - s_j\|_2^2) \times \varphi_{\mathrm{color}}(\|y_i - y_j\|_2^2)$

The weights depend on both the distance between pixel positions and the distance between pixel values:
• they consider the influence of space and color,
• closer positions contribute more to the average,
• closer intensities contribute more to the average.
67
Spatial filtering – Bilateral filter
Properties
• Generalization of moving averages and sigma filters:
  • $\varphi_{\mathrm{space}}(\cdot) = 1$: sigma filter
  • $\varphi_{\mathrm{color}}(\cdot) = 1$: moving average
• The spatial constraint avoids the dull effect
• The color constraint avoids the blur effect
68
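A direct $O(n^2)$ sketch of the bilateral filter with Gaussian space and color kernels; parameter names and the toy edge image are mine. It preserves a sharp step.

```python
import numpy as np

def bilateral(y, tau_space, tau_color):
    """Bilateral weights: Gaussian in position distance times Gaussian in value distance."""
    n1, n2 = y.shape
    I, J = np.meshgrid(np.arange(n1), np.arange(n2), indexing="ij")
    pos = np.stack([I.ravel(), J.ravel()], axis=1).astype(float)
    val = y.ravel()
    d2_space = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
    d2_color = (val[:, None] - val[None, :]) ** 2
    w = np.exp(-d2_space / (2 * tau_space**2)) * np.exp(-d2_color / (2 * tau_color**2))
    return ((w * val[None, :]).sum(axis=1) / w.sum(axis=1)).reshape(y.shape)

y = np.zeros((4, 4)); y[:, 2:] = 100.0   # sharp edge
out = bilateral(y, 2.0, 1.0)
print(round(out[0, 0], 6), round(out[0, 3], 6))   # 0.0 100.0: the edge survives
```

With a small τ_color the cross-edge weights vanish, so each side is averaged only with itself; a constant image is returned unchanged because the weights are normalized.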
Spatial filtering – Bilateral filter
Figure 9 – Selection of pixel candidates in the bilateral filter
69
Spatial filtering – Bilateral filter
(a) Noisy image σ = 10 (b) Bilateral filter τcolor = 5 (c) τcolor = 20
(d) τcolor = 40 (e) τcolor = 100 (f) τcolor = 200
$\varphi_{\mathrm{color}}(\alpha) = \exp\left(-\frac{\alpha}{2\tau_{\mathrm{color}}^2}\right)$
70
Spatial filtering – Bilateral filter
(a) Noisy image σ = 10 (b) Bilateral filter τspace = 5 (c) τspace = 10
(d) τspace = 20 (e) τspace = 50 (f) τspace = ∞
$\varphi_{\mathrm{space}}(\alpha) = 1$ if $\alpha \le \tau_{\mathrm{space}}^2$, $0$ otherwise
71
Spatial filtering – Bilateral vs moving average
Figure 10 – (left) Gaussian noise. (center) Moving average. (right) Bilateral filter.
Bilateral filter:
• suppresses more noise while respecting textures,
• but some noise remains, and the dull effect persists.
72
Spatial filtering – Bilateral vs moving average
Why does noise remain?
• Below-average pixels are mixed with other below-average pixels
• Above-average pixels are mixed with other above-average pixels

Why is there a dull effect?
• To counteract the remaining noise, $\tau_{\mathrm{color}}$ should be large
  ⇒ different things get mixed up together

What is missing? A more robust way to measure similarity. But similarity of what exactly?
73
Patches and non-local filters
Spatial filtering – Looking for other views
$T$ noisy observations $y^{(t)}$ → estimation $\hat{x}$ of the unknown signal $x$

• Sample averaging of $T$ noisy values:

$E[\hat{x}_i] = E\left[\frac{1}{T} \sum_{t=1}^{T} y^{(t)}_i\right] = \frac{1}{T} \sum_{t=1}^{T} E[y^{(t)}_i] = \frac{1}{T} \sum_{t=1}^{T} x_i = x_i$ (unbiased)

and $\mathrm{Var}[\hat{x}_i] = \mathrm{Var}\left[\frac{1}{T} \sum_{t=1}^{T} y^{(t)}_i\right] = \frac{1}{T^2} \sum_{t=1}^{T} \mathrm{Var}[y^{(t)}_i] = \frac{1}{T^2} \sum_{t=1}^{T} \sigma^2 = \frac{\sigma^2}{T}$ (reduces noise)

• . . . only if the selected values are i.i.d.

Similar ≈ close to i.i.d.
→ How can we select them on a single image? 74
Spatial filtering – Selection-based filtering
General idea
• Goal: estimate the image x from the noisy image y
• Choose a pixel i to denoise
• Inspect the pixels j around the pixel of interest i
• Select the suitable candidates j
• Average their values and update the value of i
• Repeat for all pixels i
75
Spatial filtering – Selection-based filtering
Selection rules

$\hat{x}_i = \frac{\sum_j w_{i,j}\, y_j}{\sum_j w_{i,j}}$ where

• $w_{i,j} = 1$ if $\|s_i - s_j\| \le \tau$, $0$ otherwise ← moving average
• $w_{i,j} = 1$ if $\|y_i - y_j\| \le \tau$, $0$ otherwise ← sigma filter
• $w_{i,j} = 1$ if $x_i = x_j$, $0$ otherwise ← oracle

How to choose suitable pixels $j$ to combine?
75
Spatial filtering – Patches
Definition [Oxford dictionary]
patch (noun): A small area or amount of something.
Image patches: sub-regions of the image
• shape: typically rectangular
• size: much smaller than image size
→ most common use: square regions between 5 × 5 and 21 × 21 pixels
→ trade-off:
  • larger size ⇒ more distinctive/informative
  • smaller size ⇒ easier to model/learn/match
• non-rectangular / deforming shapes: higher computational complexity

Patches capture local context: geometry and texture.
76
Spatial filtering – Patches for texture synthesis
Copying/pasting similar patches yields impressive texture synthesis:
Texture synthesis method by Efros and Leung (1999)
To generate a new pixel value:
• extract the surrounding patch (yellow)
• find similar patches in the reference image
• randomly pick one of them
• use the value of the central pixel of that patch
77
Spatial filtering – Patches for texture synthesis
Copying/pasting similar patches yields impressive texture synthesis:
Texture synthesis method by Efros and Leung (1999)
5x5 window 11x11 window 15x15 window 23x23 window
77
Spatial filtering – Non-local means
Bilateral filter [Tomasi & Manduchi, 1998]

$\hat{x}_i = \frac{\sum_{j \in N_i} w_{i,j}\, y_j}{\sum_{j \in N_i} w_{i,j}}$ with $w_{i,j} = \varphi_{\mathrm{space}}(\|s_i - s_j\|_2^2) \times \varphi_{\mathrm{color}}(\|y_i - y_j\|_2^2)$

The weights depend on the distance between pixel positions and pixel values.

Non-local means [Buades et al., 2005; Awate et al., 2005]

$\hat{x}_i = \frac{\sum_{j \in N_i} w_{i,j}\, y_j}{\sum_{j \in N_i} w_{i,j}}$ with $w_{i,j} = \varphi(\|P_i y - P_j y\|_2^2)$

• $N_i$: large neighborhood of $i$, called the search window (typically 21 × 21)
• $P_i$: operator extracting a small window, a patch, at $i$ (typically 7 × 7)

The weights within a large search window depend on the distance between patches.
78
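A toy NL-means sketch in which the search window is the whole (tiny) image; patch half-size and bandwidth defaults are arbitrary, and real implementations restrict the search to windows of roughly 21 × 21 pixels.

```python
import numpy as np

def nlmeans(y, q=1, tau=0.5):
    """NL-means sketch: (2q+1)x(2q+1) patches, search window = whole image."""
    n1, n2 = y.shape
    yp = np.pad(y, q, mode="edge")
    # one flattened patch per pixel
    P = np.stack([yp[i : i + 2 * q + 1, j : j + 2 * q + 1].ravel()
                  for i in range(n1) for j in range(n2)])
    d2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)   # squared patch distances
    w = np.exp(-d2 / (2 * tau**2))
    return ((w * y.ravel()[None, :]).sum(axis=1) / w.sum(axis=1)).reshape(y.shape)

y = np.full((4, 4), 2.0)
print(nlmeans(y)[0, 0])   # 2.0: identical patches give equal weights, mean preserved
```

The only change from the bilateral filter is the argument of the weight function: patch distances instead of single-pixel value distances, which makes the similarity measure far more robust to noise.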
Spatial filtering – Non-local means
Remarks
• The term non-local refers to the fact that disconnected pixels are mixed together. The Sigma, Yaroslavsky, and bilateral filters are therefore also non-local. But "non-local means" (or NL-means) always refers to the one using patches.
• A similar algorithm was concurrently proposed under the name UINTA.

NL-means [Buades et al., CVPR 2005]; UINTA [Awate et al., CVPR 2005]
79
Spatial filtering – Non-local means
Non-local approach [Buades et al., 2005; Awate et al., 2005]
• Local filters: average neighboring pixels
• Non-local filters: average pixels lying in a similar context

$\hat{x}_i = \frac{\sum_j w_{i,j}\, y_j}{\sum_j w_{i,j}}$

Figure: similarity of noise-free patches vs similarity of noise-free values.

Patches are redundant in most types of images (large noise reduction), and similar ones tend to share the same underlying noise-free values (unbiasedness).
80
Spatial filtering – Non-local means
Non-local approach [Buades et al., 2005; Awate et al., 2005]
• Local filters: average neighboring pixels
• Non-local filters: average pixels lying in a similar context

$\hat{x}_i = \frac{\sum_j w_{i,j}\, y_j}{\sum_j w_{i,j}}$

Local approach: weighted average over a search window, with weights $w_{i,j} = e^{-\frac{\|s_i - s_j\|_2^2}{2\tau^2}}$ (weights map centered on the pixel).

Non-local approach: weights from patch comparison, $w_{i,j} = e^{-\frac{\|P_i y - P_j y\|_2^2}{2\tau^2}}$: dissimilar patches get low weights, similar patches get high weights.
81
Spatial filtering – Non-local means
Example (Map of non-local weights)
Figure 11 – Image extracted from [Buades et al., 2005]
82
Spatial filtering – Non-local means
Figure 12 – Influence of the three main parameters of the NL-means on the solution: (a) noise-free image, (b) noisy image, (c) good parameters; rows below show too low / too high values of (d) the search window size, (e) the patch size, (f) the bandwidth τ.
83
Spatial filtering – Non-local means
Limitations of NL-means
• Respects edges, but noise remains around rare patches
• Good for textures, but loses/blurs details at low SNR

(a) Noisy image (b) NL-means (c) BM3D

Naive implementation: $O(n|N||P|)$ (about 1 minute for a 256 × 256 image)
Using integral tables: $O(n|N|)$ (a few seconds for a 256 × 256 image)
Or FFT: $O(n|N| \log |N|)$
84
Spatial filtering – Extensions of non-local means
More elaborate schemes mostly rely on patches
and use more sophisticated estimators than the average
But we will need to study some more of the basics first...
85
Questions?
Next class: basics of filtering II
Sources and images courtesy of
• L. Condat
• L. Denis
• J.Y. Gil
• H. Jiang
• I. Kokkinos
• R. Kimmel
• F. Luisier
• S. Seitz
• M. Thompson
• V.-T. Ta
• C. Vonesch
• Wikipedia
85
Source: Mike Thompson
5
Basics of filtering
Improve/denoise/detect
Fibroblast cells and microbeads (fluorescence microscopy)
Source: F. Luisier & C. Vonesch
7
Basics of filtering
Standard filters
Two main approaches:
• Spatial domain: use the pixel grid / spatial neighborhoods
• Spectral domain: use the Fourier transform, cosine transform, . . .
9
Spatial filtering – Local filters
Local / Neighboring filters
• Combine/select values of y in the neighborhood Ni,j of pixel (i, j)
• Following examples: moving average filters, derivative filters, median filters
10
Spatial filtering – Moving average
Moving average
ˆxi,j = (1 / Card(N)) Σ(k,l)∈Ni,j yk,l
Examples:
• Boxcar filter: Ni,j = {(k, l) ; |i − k| ≤ τ and |j − l| ≤ τ}
• Diskcar filter: Ni,j = {(k, l) ; |i − k|² + |j − l|² ≤ τ²}
3 × 3 boxcar filter: ˆxi,j = (1/9) Σk=i−1..i+1 Σl=j−1..j+1 yk,l
Parameters:
• Size: 3 × 3, 5 × 5, . . .
• Shape: square, disk
• Centered or not
11
Spatial filtering – Moving average
Moving average
ˆxi,j = (1 / Card(N)) Σ(k,l)∈Ni,j yk,l   or   ˆxi,j = (1 / Card(N)) Σ(k,l)∈N yi+k,j+l with N = N0,0
Examples:
• Boxcar filter: Ni,j = {(k, l) ; |i − k| ≤ τ and |j − l| ≤ τ}
• Diskcar filter: Ni,j = {(k, l) ; |i − k|² + |j − l|² ≤ τ²}
3 × 3 boxcar filter: ˆxi,j = (1/9) Σk=i−1..i+1 Σl=j−1..j+1 yk,l   or   ˆxi,j = (1/9) Σk=−1..+1 Σl=−1..+1 yi+k,j+l
Parameters:
• Size: 3 × 3, 5 × 5, . . .
• Shape: square, disk
• Centered or not
11
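As a sketch, the 3 × 3 boxcar can be implemented by summing nine shifted copies of the padded image; reflective border handling is an assumption, since the slides do not specify it:

```python
import numpy as np

def boxcar3(y):
    """3x3 boxcar moving average: xhat_{i,j} is the mean of the 3x3
    neighborhood of (i, j), with reflected borders."""
    y = np.asarray(y, dtype=float)
    ypad = np.pad(y, 1, mode='reflect')
    out = np.zeros(y.shape)
    for dk in (-1, 0, 1):          # shift over the 3x3 neighborhood offsets (k, l)
        for dl in (-1, 0, 1):
            out += ypad[1 + dk:1 + dk + y.shape[0], 1 + dl:1 + dl + y.shape[1]]
    return out / 9.0
```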
Spatial filtering – Moving average
Moving weighted average
ˆxi,j = Σ(k,l)∈Z² wk,l yi+k,j+l / Σ(k,l)∈Z² wk,l
• Neighboring filter: wi,j = 1 if (i, j) ∈ N, 0 otherwise
• Gaussian kernel: wi,j = exp(−(i² + j²) / (2τ²))
• Exponential kernel: wi,j = exp(−√(i² + j²) / τ)
12
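For instance, the Gaussian weights above can be tabulated once and normalized so the weighted average preserves constant images; the 3τ truncation radius is a common convention, not something prescribed by the slides:

```python
import numpy as np

def gaussian_kernel(tau, radius=None):
    """Gaussian weights w_{k,l} = exp(-(k^2 + l^2) / (2 tau^2)) on a square
    grid of offsets, normalized to sum to one."""
    r = int(np.ceil(3 * tau)) if radius is None else radius
    k, l = np.mgrid[-r:r + 1, -r:r + 1]   # offsets delta = (k, l)
    w = np.exp(-(k**2 + l**2) / (2 * tau**2))
    return w / w.sum()
```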
Spatial filtering – Moving average
• Rewrite ˆx as a function of s = (i, j), and let δ = (k, l) and t = s + δ:
ˆx(s) = Σδ∈Z² w(δ) y(s + δ) / Σδ∈Z² w(δ) = Σt∈Z² w(t − s) y(t) / Σt∈Z² w(t − s)
Local average filter
• Weights are functions of the distance between t and s (the length of δ):
w(t − s) = ϕ(length(t − s))
• ϕ : R+ → R: kernel function (= convolution kernel)
• Often, ϕ satisfies: ϕ(0) = 1, lim α→∞ ϕ(α) = 0, and ϕ non-increasing: α > β ⇒ ϕ(α) ≤ ϕ(β)
13
Spatial filtering – Moving average
Example
• Box filter: ϕ(α) = 1 if α ≤ τ, 0 otherwise, with length(δ) = ||δ||∞
• Disk filter: ϕ(α) = 1 if α ≤ τ, 0 otherwise, with length(δ) = ||δ||2
• Gaussian filter: ϕ(α) = exp(−α² / (2τ²)) with length(δ) = ||δ||2
• Exponential filter: ϕ(α) = exp(−α / τ) with length(δ) = ||δ||2
Reminder: ||v||p = (Σk=1..d |vk|^p)^(1/p)
14
Spatial filtering – Moving average
• ϕ often depends on (at least) one parameter τ
• τ controls the amount of filtering
• τ → 0: no filtering (output = input)
• τ → ∞: average everything in the same proportion (output = constant signal)
15
Spatial filtering – Moving average
• ϕ often depends on (at least) one parameter τ
• τ controls the amount of filtering
• τ → 0: no filtering (output = input)
• τ → ∞: average everything in the same proportion (output = constant signal)
What would ϕ(α) = 1 if α ≤ τ, 0 otherwise, with length(δ) = ||δ||1 provide?
d: dimension (d = 2 for pictures, d = 3 for videos, . . . )
15
Spatial filtering – Moving average
Box filter: (a) τ = 1 (b) τ = 20 (c) τ = 40 (d) τ = 10³
Diamond filter: (e) τ = 1 (f) τ = 20 (g) τ = 40 (h) τ = 10³
Disk filter: (i) τ = 1 (j) τ = 20 (k) τ = 40 (l) τ = 10³
16
Spatial filtering
How to express anisotropy?
• ||e1||2 = 1, ||e2||2 = 1, ⟨e1, e2⟩ = 0
length(δ) = √(δᵀ Σ⁻¹ δ) where Σ = [e1 e2] diag(λ1², λ2²) [e1 e2]ᵀ (eigendecomposition)
          = ||SRδ||2 where S = diag(λ1⁻¹, λ2⁻¹) (scaling) and R = [e1ᵀ; e2ᵀ] (rotation)
Indeed, e1 = (cos θ, sin θ)ᵀ and e2 = (−sin θ, cos θ)ᵀ, i.e.,
R = [cos θ, sin θ; −sin θ, cos θ], a rotation of −θ
17
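A quick numerical check of the identity length(δ) = √(δᵀΣ⁻¹δ) = ||SRδ||₂, with an assumed orientation θ = 30° and axis scales λ1 = 3, λ2 = 1 chosen only for illustration:

```python
import numpy as np

theta = np.deg2rad(30.0)
lam1, lam2 = 3.0, 1.0
# Rows of R are e1^T and e2^T: a rotation of -theta.
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
S = np.diag([1.0 / lam1, 1.0 / lam2])            # scaling by lambda^{-1}
Sigma = R.T @ np.diag([lam1**2, lam2**2]) @ R    # Sigma = [e1 e2] diag(lam^2) [e1 e2]^T

delta = np.array([2.0, -1.0])
length_mahalanobis = float(np.sqrt(delta @ np.linalg.inv(Sigma) @ delta))
length_scaled_rot = float(np.linalg.norm(S @ R @ delta))
# Both formulas give the same anisotropic length of delta.
```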
  • 26. Spatial filtering – Moving average (a) y (b) ˆx, θ = 0° (c) θ = 26° (d) θ = 51° (e) θ = 77° (f) θ = 103° (g) θ = 129° (h) θ = 154° 18
  • 27. Spatial filtering – Moving average for denoising Moving average for denoising? Figure 1 – (left) Gaussian noise σ = 10. (right) Gaussian filter τ = 3. 19
  • 28. Spatial filtering – Moving average for denoising Moving average for denoising? Figure 1 – (left) Gaussian noise σ = 30. (right) Gaussian filter τ = 5. 19
  • 29. Spatial filtering – Moving average for denoising Input image Boxcar filter Gaussian filter • Boxcar: oscillations/artifacts in vertical and horizontal directions • Gaussian: no artifacts • Moving average: reduces noise, but loses resolution, gives a blurry aspect, and removes edges 20
  • 31. Spatial filtering – Moving average for denoising Image blur ⇒ No more edges ⇒ Structure destruction ⇒ Reduction of image quality What is an edge? 21
  • 32. Spatial filtering – Edges Edges? • Separation between objects, important parts of the image • Necessary for vision in order to reconstruct objects 22
  • 33. Spatial filtering – Edges Edge: More or less brutal change of intensity 23
  • 34. Spatial filtering – Edges • no edges ≡ no objects in the image • abrupt change ⇒ gap between intensities ⇒ large derivative 24
  • 35. Spatial filtering – Derivative filters How to detect edges? • Look at the derivative • How? Use derivative filters • What? Filters that behave somehow as the derivative of real functions How to design such filters? 25
  • 36. Spatial filtering – Derivative filters Derivative of 1d signals • Derivative of a function x : R → R, if it exists: x′(t) = lim_{h→0} [x(t+h) − x(t)]/h = lim_{h→0} [x(t) − x(t−h)]/h = lim_{h→0} [x(t+h) − x(t−h)]/(2h) (equivalent definitions) • For a 1d discrete signal, finite differences are (x′)_k = x_{k+1} − x_k (forward), (x′)_k = x_k − x_{k−1} (backward), (x′)_k = (x_{k+1} − x_{k−1})/2 (centered) 26
  • 37. Spatial filtering – Derivative filters Derivative of 1d signals • Can be written as a filter: x̂_i = Σ_{k=−1}^{+1} κ_k y_{i+k}, with κ = (0, −1, 1) forward, κ = (−1, 1, 0) backward, κ = (−1/2, 0, 1/2) centered 27
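The three kernels can be checked numerically. A small sketch (the helper name `correlate1d` is illustrative):

```python
import numpy as np

def correlate1d(y, kappa):
    """x_hat[i] = sum_{k=-1}^{1} kappa[k+1] * y[i+k]; border samples left at 0."""
    out = np.zeros(len(y))
    for i in range(1, len(y) - 1):
        out[i] = sum(kappa[k + 1] * y[i + k] for k in (-1, 0, 1))
    return out

forward, backward, centered = [0., -1., 1.], [-1., 1., 0.], [-0.5, 0., 0.5]
y = np.array([0., 1., 4., 9., 16., 25.])  # y_k = k^2
```

Each kernel recovers the corresponding finite difference, e.g. `correlate1d(y, forward)[2]` equals `y[3] - y[2]`.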
  • 38. Spatial filtering – Derivative filters Derivative of 2d signals • Gradient of a function x : R² → R, if it exists: ∇x = (∂x/∂s1, ∂x/∂s2)^T with ∂x/∂s1 (s1, s2) = lim_{h→0} [x(s1 + h, s2) − x(s1, s2)]/h and ∂x/∂s2 (s1, s2) = lim_{h→0} [x(s1, s2 + h) − x(s1, s2)]/h 28
  • 39. Spatial filtering – Derivative filters Derivative of 2d signals • Gradient for a 2d discrete signal: finite differences in each direction (∇1x)_{i,j} = Σ_{k=−1}^{+1} Σ_{l=−1}^{+1} (κ1)_{k,l} y_{i+k,j+l} and (∇2x)_{i,j} = Σ_{k=−1}^{+1} Σ_{l=−1}^{+1} (κ2)_{k,l} y_{i+k,j+l} • Forward: κ1 = [0 0 0 ; 0 −1 0 ; 0 1 0], κ2 = [0 0 0 ; 0 −1 1 ; 0 0 0] • Backward: κ1 = [0 −1 0 ; 0 1 0 ; 0 0 0], κ2 = [0 0 0 ; −1 1 0 ; 0 0 0] • Centered: κ1 = [0 −1/2 0 ; 0 0 0 ; 0 1/2 0], κ2 = [0 0 0 ; −1/2 0 1/2 ; 0 0 0] 29
  • 40. Spatial filtering – Derivative filters Second order derivative of 1d signals • Second order derivative of a function x : R → R, if it exists: x″(t) = lim_{h→0} [x(t − h) − 2x(t) + x(t + h)]/h² • For a 1d discrete signal: (x″)_k = x_{k−1} − 2x_k + x_{k+1} • Corresponding filter: h = (1, −2, 1) Laplacian of 2d signals • Laplacian of a function x : R² → R, if it exists: ∆x = ∂²x/∂s1² + ∂²x/∂s2² • For a 2d discrete signal: (∆x)_{i,j} = x_{i−1,j} + x_{i,j−1} − 4x_{i,j} + x_{i+1,j} + x_{i,j+1} • Corresponding filter: h = [0 1 0 ; 1 −4 1 ; 0 1 0] = (1, −2, 1)^T embedded vertically + (1, −2, 1) embedded horizontally 30
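The decomposition of the 2d Laplacian kernel into two 1d second-derivative kernels can be verified directly (a small NumPy check):

```python
import numpy as np

lap1d = np.array([1., -2., 1.])  # 1d second-derivative kernel

# Embed the 1d kernel in the center column and center row of a 3x3 grid
vert = np.zeros((3, 3)); vert[:, 1] = lap1d
horz = np.zeros((3, 3)); horz[1, :] = lap1d

lap2d = vert + horz  # sum of second derivatives in each direction
```

The sum gives exactly the 5-point Laplacian stencil with −4 in the center.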
  • 41. Spatial filtering – Derivative filters (a) x (b) ∇1x (c) ∇2x (d) ∆x (e) x (f) ∇1x (g) ∇2x (h) ∆x Derivative filters detect edges but are sensitive to noise 31
  • 42. Spatial filtering – Derivative filters Other derivative filters • Roberts cross operator (1963): κ = [+1 0 ; 0 −1] and κ = [0 +1 ; −1 0] • Sobel operator (1968): κ1 = [−1 −2 −1 ; 0 0 0 ; 1 2 1] = (−1, 0, 1)^T (1, 2, 1) and κ2 = [−1 0 1 ; −2 0 2 ; −1 0 1] = (1, 2, 1)^T (−1, 0, 1) • Prewitt operator (1970): κ1 = [−1 −1 −1 ; 0 0 0 ; 1 1 1] = (−1, 0, 1)^T (1, 1, 1) and κ2 = [−1 0 1 ; −1 0 1 ; −1 0 1] = (1, 1, 1)^T (−1, 0, 1) 32
  • 43. Spatial filtering – Derivative filters Edge detection Based on the norm (and angle) of the discrete approximation of the gradient: ||∇x||_k = √((∇1x)_k² + (∇2x)_k²) and (∠∇x)_k = atan2((∇2x)_k, (∇1x)_k) 33
  • 44. Spatial filtering – Derivative filters (a) x (b) ∇1x (c) ∇2x (d) ||∇x|| (e) x (f) ∇1x (Prewitt) (g) ∇2x (Prewitt) (h) ||∇x|| (Sobel) Sobel & Prewitt: average in one direction, and differentiate in the other one ⇒ More robust to noise 34
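The claim that Sobel and Prewitt smooth in one direction and differentiate in the other is just an outer product of 1d kernels; a quick check:

```python
import numpy as np

diff = np.array([-1., 0., 1.])           # differentiate
sobel_smooth = np.array([1., 2., 1.])    # Sobel smoothing
prewitt_smooth = np.array([1., 1., 1.])  # Prewitt smoothing

# kappa1: differentiate vertically, smooth horizontally
sobel_k1 = np.outer(diff, sobel_smooth)
prewitt_k1 = np.outer(diff, prewitt_smooth)
```

Both products reproduce the κ1 matrices of the previous slide exactly.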
  • 45. Spatial filtering – Averaging and derivative filters Comparison between averaging and derivative filters • Moving average: x̂_{i,j} = Σ_{(k,l)∈Z²} w_{k,l} y_{i+k,j+l} / Σ_{(k,l)∈Z²} w_{k,l} = Σ_{(k,l)∈Z²} κ_{k,l} y_{i+k,j+l} with κ_{k,l} = w_{k,l} / Σ_{(p,q)∈Z²} w_{p,q}, hence Σ_{(k,l)∈Z²} κ_{k,l} = 1 (preserves the mean) • Derivative filter: x̂_{i,j} = Σ_{(k,l)∈Z²} κ_{k,l} y_{i+k,j+l} with Σ_{(k,l)∈Z²} κ_{k,l} = 0 (removes the mean) • They share the same expression Do all filters have such an expression? 35
  • 46. Spatial filtering – Linear translation-invariant filters No, only linear translation-invariant (LTI) filters Let ψ satisfy 1 Linearity: ψ(ax + by) = aψ(x) + bψ(y) 2 Translation-invariance: ψ(y_τ) = ψ(y)_τ where x_τ(s) = x(s + τ) Then, there exist coefficients κ_{k,l} such that ψ(y)_{i,j} = Σ_{(k,l)∈Z²} κ_{k,l} y_{i+k,j+l} The converse also holds Note: Translation-invariant = Shift-invariant = Stationary = Same weighting applied everywhere = Identical behavior on identical structures, whatever their location 36
  • 47. Spatial filtering – Linear translation-invariant filters Linear translation-invariant filters x̂_{i,j} = ψ(y)_{i,j} = Σ_{(k,l)∈Z²} κ_{k,l} y_{i+k,j+l} • Weighted average filters: Σ κ_{k,l} = 1 Ex.: Box, Gaussian, Exponential, . . . • Derivative filters: Σ κ_{k,l} = 0 Ex.: Laplacian, Sobel, Roberts, . . . 37
  • 48. Spatial filtering – Linear translation-invariant filters LTI filter ≡ Moving weighted sum ≡ Cross-correlation ≡ Convolution x̂_{i,j} = Σ_{(k,l)∈Z²} κ*_{k,l} y_{i+k,j+l} = (κ ⋆ y)_{i,j} (conjugate for κ complex) = Σ_{(k,l)∈Z²} ν_{k,l} y_{i−k,j−l} = (ν ∗ y)_{i,j} where ν_{k,l} = κ*_{−k,−l} ν is called the convolution kernel (impulse response of the filter) 38
  • 49. Spatial filtering – LTI filters and convolution Properties of the convolution product • Linear f ∗ (αg + βh) = α(f ∗ g) + β(f ∗ h) • Commutative f ∗ g = g ∗ f • Associative f ∗ (g ∗ h) = (f ∗ g) ∗ h • Separable h = h1 ∗ h2 ∗ . . . ∗ hp ⇒ f ∗ h = (((f ∗ h1) ∗ h2) . . . ∗ hp) 39
  • 50. Spatial filtering – LTI filters and convolution • Directional separability of (isotropic) Gaussians: G2d τ = G1d horizontal τ ∗ G1d vertical τ 40
  • 52. Spatial filtering – LTI filters and convolution Directional separability of Gaussians. (y ∗ G2d_τ)_{i,j} = (1/Z) Σ_{k=−∞}^{∞} Σ_{l=−∞}^{∞} exp(−(k² + l²)/(2τ²)) y_{i−k,j−l} ≈ (1/Z) Σ_{k=−q}^{q} Σ_{l=−q}^{q} exp(−(k² + l²)/(2τ²)) y_{i−k,j−l} (restriction to an s × s window, s = 2q + 1; complexity O(s² n1 n2)) ≈ (1/Z) Σ_{k=−q}^{q} exp(−k²/(2τ²)) [Σ_{l=−q}^{q} exp(−l²/(2τ²)) y_{i−k,j−l}] — the inner sum is ∝ (y ∗ G1d horizontal)_{i−k,j}, and the outer sum then applies G1d vertical (complexity O(s n1 n2)) 41
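The directional separability used above can be verified on the truncated kernel itself: the 2d Gaussian kernel is the outer product of two 1d ones. A small sketch (τ and q are arbitrary):

```python
import numpy as np

tau, q = 2.0, 6
k = np.arange(-q, q + 1, dtype=float)
g1d = np.exp(-k**2 / (2 * tau**2))
g1d /= g1d.sum()

# Direct 2d kernel on the s x s window, s = 2q + 1
kk, ll = np.meshgrid(k, k, indexing="ij")
g2d = np.exp(-(kk**2 + ll**2) / (2 * tau**2))
g2d /= g2d.sum()

# Separability: the normalized 2d kernel factorizes exactly
assert np.allclose(g2d, np.outer(g1d, g1d))
```

This factorization is what reduces the per-pixel cost from O(s²) to O(s).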
  • 53. Spatial filtering – LTI filters and convolution • Multi-scale separability of Gaussians: (Continuous case) G_{τ1} ∗ G_{τ2} = G_{√(τ1² + τ2²)} 42
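The identity holds in the continuous case; sampled and truncated Gaussians satisfy it only up to tiny tail errors, as a quick numerical check suggests (the parameter values and tolerance are illustrative):

```python
import numpy as np

def gauss1d(tau, q):
    # normalized sampled 1d Gaussian on {-q, ..., q}
    k = np.arange(-q, q + 1, dtype=float)
    g = np.exp(-k**2 / (2 * tau**2))
    return g / g.sum()

t1, t2, q = 2.0, 1.5, 20
conv = np.convolve(gauss1d(t1, q), gauss1d(t2, q))  # length 4q + 1
target = gauss1d(np.hypot(t1, t2), 2 * q)           # G_{sqrt(t1^2 + t2^2)}
assert np.allclose(conv, target, atol=1e-4)
```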
  • 54. Spatial filtering – LTI filters and convolution • Separability of Derivatives of Gaussian (DoG): (Continuous case) ∂(Gτ ∗ f)/∂s = (∂Gτ/∂s) ∗ f = Gτ ∗ (∂f/∂s) 43
  • 56. Spatial filtering – LTI filters and convolution Separability of other LTI filters (directional sep. / multi-scale sep.): Gaussian filter: √ (↓ ∗ →) / √ • Exponential filter: x / x • Box filter: √ (↓ ∗ →) / x • Disk filter: x / x • Diamond filter: x / x • Laplacian: √ (↓ + →) / - • Sobel: √ (↓ ∗ →) / - • Prewitt: √ (↓ ∗ →) / - 44
  • 57. Spatial filtering – LTI and linear algebra LTI filters can be written as a matrix-vector product Functional representation: x̂(s) = Σ_{δ∈Z²} κ*(δ) y(s + δ) = Σ_{δ∈Z²} ν(δ) y(s − δ) Vector representation: x̂ = Hy with h_{i,j} = κ*(s_j − s_i) = ν(s_i − s_j) • Vectors represent objects (here: images) • Matrices represent linear processings (here: convolution) 45
  • 58. Spatial filtering – LTI and linear algebra Proof in the periodic case. • Let δ = s_j − s_i. Assuming periodic boundary conditions, we get x̂(s_i) = Σ_{j=0}^{n−1} κ*(s_j − s_i) y(s_i + (s_j − s_i)) = Σ_{j=0}^{n−1} κ*(s_j − s_i) y(s_j) • Let h_{i,j} = κ*(s_j − s_i), x̂_i = x̂(s_i) and y_j = y(s_j): x̂_i = Σ_{j=0}^{n−1} h_{i,j} y_j • Define the matrix H = (h_{i,j}), then x̂ = Hy. 46
  • 59. Spatial filtering – LTI and linear algebra What does H look like? 1d periodic case • In 1d, LTI stands for linear time-invariant filters and reads x̂(t_i) = Σ_{j=0}^{n−1} ν(t_i − t_j) y(t_j) • Consider t_i − t_j = i − j, and let h_{i,j} = ν(t_i − t_j) = ν_{(i−j) mod n}. • H is a circulant matrix, each row being the previous one shifted right by one: H = [ν_0 ν_{n−1} ν_{n−2} . . . ν_2 ν_1 ; ν_1 ν_0 ν_{n−1} ν_{n−2} . . . ν_2 ; . . . ; ν_{n−1} ν_{n−2} . . . ν_2 ν_1 ν_0] 47
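In the 1d periodic case, H can be built explicitly and checked against circular convolution computed with the FFT; a sketch (the kernel values are arbitrary):

```python
import numpy as np

n = 8
nu = np.zeros(n)
nu[[0, 1, n - 1]] = [0.5, 0.25, 0.25]  # periodic 3-tap averaging kernel

# Circulant matrix: h_{i,j} = nu_{(i - j) mod n}
H = np.array([[nu[(i - j) % n] for j in range(n)] for i in range(n)])

y = np.cos(2 * np.pi * np.arange(n) / n) + 0.1 * np.arange(n)
# Circular convolution via the DFT
circ = np.real(np.fft.ifft(np.fft.fft(nu) * np.fft.fft(y)))
assert np.allclose(H @ y, circ)
```

Each row of H is indeed the previous row shifted right by one position.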
  • 60. Spatial filtering – LTI and linear algebra What does H look like? 2d periodical case • In 2d, H is a doubly block circulant matrix: H is circulant by blocks (one block per circular row offset), and each block is itself circulant (one entry per circular column offset), i.e., for the vectorized image x = (x_{0,0}, x_{0,1}, . . . , x_{n1−1,n2−1})^T, Hx stacks the values Σ_{k,l} ν_{(i−k) mod n1, (j−l) mod n2} x_{k,l} 48
  • 61. Spatial filtering – Properties of circulant matrices Properties of circulant matrices • Recall that the convolution is commutative: f ∗ g = g ∗ f ⇒ Idem for (doubly block) circulant matrices: H1H2 = H2H1 • Commuting diagonalizable matrices share the same eigenvectors ⇒ All circulant matrices share the same eigenvectors ⇒ LTI filters act in the same eigenspace What eigenspace is that? 49
  • 62. Spatial filtering – Properties of circulant matrices Theorem • The n unit-norm eigenvectors of any circulant matrix H read as e_k = (1/√n) (1, exp(2πik/n), exp(4πik/n), . . . , exp(2(n − 1)πik/n)) for k = 0 to n − 1. (Note: here i is the imaginary number) • Recall that the unit-norm eigenvectors (e_k) must satisfy: H e_k = λ_k e_k, e*_k e_l = 0 if k ≠ l, and ||e_k||_2 = 1 Challenge: try to prove it yourself. The (e_k) form a basis in which LTI filters modulate each element pointwise. This will be at the heart of Chapter 3. 50
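The theorem is easy to check numerically: the Fourier vectors e_k are eigenvectors of any circulant H, the eigenvalue being recovered here as a Rayleigh quotient. A quick check (kernel values arbitrary, deliberately non-symmetric):

```python
import numpy as np

n = 8
nu = np.zeros(n); nu[[0, 1, n - 1]] = [0.5, 0.3, 0.2]
H = np.array([[nu[(i - j) % n] for j in range(n)] for i in range(n)])

idx = np.arange(n)
for kk in range(n):
    e = np.exp(2j * np.pi * kk * idx / n) / np.sqrt(n)  # e_k from the theorem
    lam = np.vdot(e, H @ e)  # Rayleigh quotient = eigenvalue (e is unit-norm)
    assert np.allclose(H @ e, lam * e)
    assert np.isclose(np.linalg.norm(e), 1.0)
```

The eigenvalues λ_k are the DFT coefficients of the kernel ν, which is the pointwise modulation announced for Chapter 3.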
  • 63. Spatial filtering – LTI filters – Limitations Limitations of LTI filters • Derivative filters: • Detect edges, but • Sensitive to noise • Moving average: • Decrease noise, but • Do not preserve edges Difficult object/background separation LTI filters cannot achieve a good trade-off in terms of noise vs edge separation 51
  • 64. Spatial filtering – LTI filters – Limitations Weak robustness against outliers Figure 2 – (left) Impulse noise. (center) Gaussian filter τ = 5. (right) τ = 11. • Even less efficient for impulse noise • For the best trade-off: structures are lost, noise remains • Do not adapt to the signal. Can we achieve better performance by designing an adaptive filter? 52
  • 67. Spatial filtering – Adaptive filtering Linear filter ⇒ Non-adaptive filter • Linear filters are non-adaptive • The operation does not depend on the signal Simple, fast implementation Introduce blur, do not preserve edges Adaptive filter ⇒ Non-linear filter • Adapt the filtering to the content of the image • Operations/decisions depend on the values of y • Adaptive ⇒ non-linear: ψ(αx + βy) ≠ αψ(x) + βψ(y) Since adapting to x or to y is not the same as adapting to αx + βy. 53
  • 68. Spatial filtering – Median filter Median filters • Try to denoise while respecting main structures ˆxi,j = median(y(i + k, j + l) | (k, l) ∈ N), N : neighborhood 54
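A minimal sketch of the median filter above (square neighborhood and replicate borders are illustrative choices):

```python
import numpy as np

def median_filter(y, radius):
    """Median over a (2*radius+1)^2 neighborhood, with replicated borders."""
    pad = np.pad(y, radius, mode="edge")
    size = 2 * radius + 1
    out = np.empty(y.shape, dtype=float)
    for i in range(y.shape[0]):
        for j in range(y.shape[1]):
            out[i, j] = np.median(pad[i:i + size, j:j + size])
    return out

# An isolated impulse on a flat background disappears completely,
# while a linear filter would only spread it around.
y = np.zeros((5, 5)); y[2, 2] = 255.0
assert np.all(median_filter(y, 1) == 0.0)
```

An ideal step edge is also left exactly in place, since the median always picks the majority side of the window.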
  • 69. Spatial filtering – Median filter Behavior of median filters • Remove isolated points and thin structures • Preserve (staircase) edges and smooth corners 55
  • 70. Spatial filtering – Median filter Figure 3 – (left) Impulse noise. (center) 3 × 3 median filter. (right) 9 × 9. 56
  • 71. Spatial filtering – Median vs Gaussian Figure 4 – (left) Impulse noise. (center) 9 × 9 median filter. (right) Gaussian τ = 4. 57
  • 72. Spatial filtering – Median vs Gaussian Figure 5 – (left) Gaussian noise. (center) 5 × 5 median filter. (right) Gaussian τ = 3. 58
  • 73. Spatial filtering – Other standard non-linear filters Morphological operators • Erosion x̂_{i,j} = min(y(i + k, j + l) | (k, l) ∈ N) • Dilation x̂_{i,j} = max(y(i + k, j + l) | (k, l) ∈ N) • N is called the structuring element Figure 6 – (left) Salt-and-pepper noise, (center) Erosion, (right) Dilation 59
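A minimal sketch of these two operators with a square structuring element (border handling by replication is an illustrative choice):

```python
import numpy as np

def erode(y, radius):
    """Erosion: local minimum over a (2*radius+1)^2 structuring element."""
    pad = np.pad(y, radius, mode="edge")
    s = 2 * radius + 1
    return np.array([[pad[i:i + s, j:j + s].min() for j in range(y.shape[1])]
                     for i in range(y.shape[0])], dtype=float)

def dilate(y, radius):
    """Dilation: local maximum over the structuring element."""
    pad = np.pad(y, radius, mode="edge")
    s = 2 * radius + 1
    return np.array([[pad[i:i + s, j:j + s].max() for j in range(y.shape[1])]
                     for i in range(y.shape[0])], dtype=float)

salt = np.zeros((5, 5)); salt[2, 2] = 255.0        # isolated bright pixel
pepper = np.full((5, 5), 255.0); pepper[2, 2] = 0  # isolated dark pixel
assert np.all(erode(salt, 1) == 0.0)       # erosion removes salt
assert np.all(dilate(pepper, 1) == 255.0)  # dilation removes pepper
```

Closing and opening compose these two primitives in the two possible orders.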
  • 74. Spatial filtering – Morphological operators Figure 7 – (top) Closing, (bottom) Opening. (Source: J.Y. Gil & R. Kimmel) • Closing: dilation followed by erosion • Opening: erosion followed by dilation 60
  • 75. Spatial filtering – Global filtering Local filter • The operation depends only on the local neighborhood • ex: Gaussian filter, median filter Simple, fast implementation Do not preserve textures (global context) Global filter • Adapt the filtering to the global content of the image • Result at each pixel may depend on all other pixel values • Idea: Use non-linearity and global information 61
  • 76. Spatial filtering – Global average filters Local average filter ˆxi = n j=1 wi,jyj n j=1 wi,j with wi,j = ϕ(||si − sj||2 2) weights depend on the distance between pixel positions (linear) 62
  • 77. Spatial filtering – Global average filters Sigma filter [Lee, 1981] / Yaroslavsky filter [Yaroslavsky, 1985] x̂_i = Σ_{j=1}^{n} w_{i,j} y_j / Σ_{j=1}^{n} w_{i,j} with w_{i,j} = ϕ(||y_i − y_j||²_2) weights depend on the distance between pixel values (non-linear) Sigma filter: ϕ(α) = 1 if α ≤ τ², 0 otherwise 63
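A direct sketch of the global sigma filter (the condition ||y_i − y_j||² ≤ τ² is written equivalently as |y_i − y_j| ≤ τ; O(n²), as noted later in the slides):

```python
import numpy as np

def sigma_filter(y, tau):
    """Global sigma filter: each pixel is averaged with all pixels
    whose value lies within tau of its own (hard-threshold phi)."""
    flat = y.ravel().astype(float)
    out = np.empty_like(flat)
    for i, v in enumerate(flat):
        sel = np.abs(flat - v) <= tau
        out[i] = flat[sel].mean()
    return out.reshape(y.shape)

# Two well-separated flat regions: each only averages with itself,
# so the edge between them is perfectly preserved.
y = np.array([[0., 1., 0.], [100., 101., 100.]])
xhat = sigma_filter(y, 10.0)
assert np.allclose(xhat[0], 1. / 3.)
assert np.allclose(xhat[1], 100. + 1. / 3.)
```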
  • 78. Spatial filtering – Sigma filter Figure 8 – Selection of pixel candidates in the sigma filter 64
  • 79. Spatial filtering – Sigma filter (a) Noisy image σ = 10 (b) Sigma filter τ = 50 (c) τ = 100 (d) τ = 150 (e) τ = 200 (f) τ = 250 65
  • 80. Spatial filtering – Sigma filter Limitations of Sigma filter Respects edges Produces a loss of contrast: dull effect Does not reduce noise as much Equivalent to a change of histogram: • each value is mapped to another one • the mapping depends on the image (adaptive/non-linear filtering) (plot: intensity mappings for τ = 0, 50, 100, 150, 200, 250) Naive implementation: O(n²) Back to O(n) by using histograms Idea: apply the sigma filter on moving windows ≡ Mix moving average with sigma filter 66
  • 81. Spatial filtering – Bilateral filter Bilateral filter [Tomasi & Manduchi, 1998] x̂_i = Σ_{j=1}^{n} w_{i,j} y_j / Σ_{j=1}^{n} w_{i,j} with w_{i,j} = ϕ_space(||s_i − s_j||²_2) × ϕ_color(||y_i − y_j||²_2) Weights depend on both the distance • between pixel positions, and • between pixel values. • Consider the influence of space and color, • Closer positions affect the average more, • Closer intensities affect the average more. 67
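A minimal sketch of the bilateral filter, using Gaussian functions for both ϕ_space and ϕ_color (one common choice; later slides also use a hard-threshold ϕ_space):

```python
import numpy as np

def bilateral(y, tau_space, tau_color):
    """Bilateral filter: product of a spatial and a color Gaussian weight."""
    h, w = y.shape
    ii, jj = np.mgrid[0:h, 0:w]
    pos = np.stack([ii.ravel(), jj.ravel()], axis=1).astype(float)
    val = y.ravel().astype(float)
    out = np.empty_like(val)
    for p in range(val.size):
        w_space = np.exp(-((pos - pos[p])**2).sum(axis=1) / (2 * tau_space**2))
        w_color = np.exp(-(val - val[p])**2 / (2 * tau_color**2))
        wgt = w_space * w_color
        out[p] = (wgt * val).sum() / wgt.sum()
    return out.reshape(h, w)

# An ideal step edge with a small tau_color is left untouched:
# the color weight kills any averaging across the edge.
y = np.array([[0., 0., 100., 100.]] * 3)
assert np.allclose(bilateral(y, 1.0, 1.0), y)
```

With ϕ_space ≡ 1 this degenerates to the sigma filter, and with ϕ_color ≡ 1 to a moving average, as the next slide states.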
  • 82. Spatial filtering – Bilateral filter Properties • Generalization of moving averages and sigma filters. • ϕspace(·) = 1: sigma filter • ϕcolor(·) = 1: moving average • Spatial constraint: avoid dull effects • Color constraint: avoid blur effects 68
  • 83. Spatial filtering – Bilateral filter Figure 9 – Selection of pixel candidates in the bilateral filter 69
  • 84. Spatial filtering – Bilateral filter (a) Noisy image σ = 10 (b) Bilateral filter τcolor = 5 (c) τcolor = 20 (d) τcolor = 40 (e) τcolor = 100 (f) τcolor = 200 ϕcolor(α) = exp(−α / (2τ²color)) 70
  • 85. Spatial filtering – Bilateral filter (a) Noisy image σ = 10 (b) Bilateral filter τspace = 5 (c) τspace = 10 (d) τspace = 20 (e) τspace = 50 (f) τspace = ∞ ϕspace(α) = 1 if α ≤ τ²space, 0 otherwise 71
  • 86. Spatial filtering – Bilateral vs moving average Figure 10 – (left) Gaussian noise. (center) Moving average. (right) Bilateral filter. The bilateral filter suppresses more noise while respecting the textures, but residual noise and dull effects remain 72
  • 87. Spatial filtering – Bilateral vs moving average Why is there remaining noise? • Below-average pixels are mixed with other below-average pixels • Above-average pixels are mixed with other above-average pixels Why are there dull effects? • To counteract the remaining noise effect, τcolor should be large ⇒ different things get mixed up together What is missing? A more robust way to measure similarity, but similarity of what exactly? 73
  • 89. Spatial filtering – Looking for other views T noisy observations y^(t) Estimation x̂ of the unknown signal x • Sample averaging of T noisy values: E[x̂_i] = E[(1/T) Σ_{t=1}^{T} y^(t)_i] = (1/T) Σ_{t=1}^{T} E[y^(t)_i] = (1/T) Σ_{t=1}^{T} x_i = x_i (unbiased) and Var[x̂_i] = Var[(1/T) Σ_{t=1}^{T} y^(t)_i] = (1/T²) Σ_{t=1}^{T} Var[y^(t)_i] = (1/T²) Σ_{t=1}^{T} σ² = σ²/T (reduced noise) • . . . only if the selected values are iid. similar = close to iid → How can we select them on a single image? 74
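The unbiasedness and the σ²/T variance reduction derived above can be checked by simulation (the values of x, σ, T, and the number of pixels are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, T, n = 10.0, 100, 5000
x = 42.0  # the (constant) unknown signal, for simplicity

y = x + sigma * rng.standard_normal((T, n))  # T iid noisy observations
xhat = y.mean(axis=0)                        # sample average over the T copies

# Unbiased, and variance divided by T, as derived above
assert abs(xhat.mean() - x) < 0.1
assert abs(xhat.var() - sigma**2 / T) < 0.1
```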
  • 96. Spatial filtering – Selection-based filtering General idea • Goal: estimate the image x from the noisy image y • Choose a pixel i to denoise • Inspect the pixels j around the pixel of interest i • Select the suitable candidates j • Average their values and update the value of i • Repeat for all pixels i 75
  • 97. Spatial filtering – Selection-based filtering Selection rules x̂_i = Σ_j w_{i,j} y_j / Σ_j w_{i,j} where w_{i,j} = 1 if ||s_i − s_j|| ≤ τ, 0 otherwise ← Moving average w_{i,j} = 1 if ||y_i − y_j|| ≤ τ, 0 otherwise ← Sigma filter w_{i,j} = 1 if x_i = x_j, 0 otherwise ← Oracle How to choose suitable pixels j to combine? 75
  • 98. Spatial filtering – Patches Definition [Oxford dictionary] patch (noun): A small area or amount of something. Image patches: sub-regions of the image • shape: typically rectangular • size: much smaller than image size → most common use: square regions between 5 × 5 and 21 × 21 pixels → trade-off: larger size ⇒ more distinctive/informative; smaller size ⇒ easier to model/learn/match non-rectangular / deforming shapes: computational complexity patches capture local context: geometry and texture 76
  • 99. Spatial filtering – Patches for texture synthesis Copying/pasting similar patches yields impressive texture synthesis: Texture synthesis method by Efros and Leung (1999) To generate a new pixel value: • extract the surrounding patch (yellow) • find similar patches in the reference image • randomly pick one of them • use the value of the central pixel of that patch 77
  • 100. Spatial filtering – Patches for texture synthesis Copying/pasting similar patches yields impressive texture synthesis: Texture synthesis method by Efros and Leung (1999) 5x5 window 11x11 window 15x15 window 23x23 window 77
  • 101. Spatial filtering – Non-local means Bilateral filter [Tomasi & Manduchi, 1998] x̂_i = Σ_{j∈Ni} w_{i,j} y_j / Σ_{j∈Ni} w_{i,j} with w_{i,j} = ϕspace(||s_i − s_j||²_2) × ϕcolor(||y_i − y_j||²_2) weights depend on the distance between pixel positions and pixel values Non-local means [Buades et al., 2005, Awate et al., 2005] x̂_i = Σ_{j∈Ni} w_{i,j} y_j / Σ_{j∈Ni} w_{i,j} with w_{i,j} = ϕ(||P_i y − P_j y||²_2) • N_i: large neighborhood of i, called search window (typically 21 × 21) • P_i: operator extracting a small window, patch, at i (typically 7 × 7) weights in a large search window depend on the distance between patches 78
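A direct O(n|N||P|) implementation of the NL-means formula above can be sketched as follows (window sizes are shrunk from the typical 21 × 21 / 7 × 7 for readability; this is a sketch, not the authors' code):

```python
import numpy as np

def nl_means(y, search=7, patch=3, tau=10.0):
    """NL-means sketch: weights from patch distances within a search window."""
    r, p = search // 2, patch // 2
    pad = np.pad(y.astype(float), p, mode="edge")
    h, w = y.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            Pi = pad[i:i + patch, j:j + patch]  # patch around pixel (i, j)
            num = den = 0.0
            # Search window N_i, clipped at the image borders
            for k in range(max(0, i - r), min(h, i + r + 1)):
                for l in range(max(0, j - r), min(w, j + r + 1)):
                    Pj = pad[k:k + patch, l:l + patch]
                    wkl = np.exp(-((Pi - Pj)**2).sum() / (2 * tau**2))
                    num += wkl * y[k, l]
                    den += wkl
            out[i, j] = num / den
    return out

# A constant image is a fixed point: all patches are identical (weights = 1)
y = np.full((6, 6), 5.0)
assert np.allclose(nl_means(y), 5.0)
```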
  • 102. Spatial filtering – Non-local means Remarks The term non-local refers to the fact that disconnected pixels are mixed together. The Sigma, Yaroslavsky and Bilateral filters are then also non-local. But Non-Local means (or NL-means) always refers to the one using patches. A similar algorithm was concurrently proposed under the name UINTA. NL-means [Buades et al., CVPR 2005] UINTA [Awate et al., CVPR 2005] 79
  • 103. Spatial filtering – Non-local means Non-local approach [Buades et al., 2005, Awate et al., 2005] • Local filters: average neighborhood pixels • Non-local filters: average pixels being in a similar context x̂_i = Σ_j w_{i,j} y_j / Σ_j w_{i,j} (plot: similarity of noise-free patches vs similarity of noise-free values) Patches are redundant in most types of images (large noise reduction) and similar ones tend to share the same underlying noise-free values (unbiasedness) 80
  • 106. Spatial filtering – Non-local means Non-local approach [Buades et al., 2005, Awate et al., 2005] • Local filters: average neighborhood pixels • Non-local filters: average pixels being in a similar context x̂_i = Σ_j w_{i,j} y_j / Σ_j w_{i,j} Local approach: w_{i,j} = exp(−||s_i − s_j||²_2 / (2τ²)) (weighted average over the search window) Non-local approach: w_{i,j} = exp(−||P_i y − P_j y||²_2 / (2τ²)) (patch comparison: dissimilar patches ⇒ low weights, similar patches ⇒ high weights) 81
  • 107. Spatial filtering – Non-local means Example (Map of non-local weights) Figure 11 – Image extracted from [Buades et al., 2005] 82
  • 108. Spatial filtering – Non-local means Good parameters (a) Noise-free image (b) Noisy image (c) Good parameters Too low / too high parameters (d) Search window size (e) Patch size (f) Bandwidth τ Figure 12 – Influence of the three main parameters of the NL means on the solution. 83
  • 109. Spatial filtering – Non-local means Limitations of NL-means Respect edges Remaining noise around rare patches Good for texture Lose/blur details with low SNR (a) Noisy image (b) NL-means (c) BM3D Naive implementation: O(n|N||P|) (∼ 1 minute for 256 × 256 image) Using integral tables: O(n|N|) (few seconds for 256 × 256 image) Or FFT: O(n|N| log |N|) 84
  • 110. Spatial filtering – Extensions of non-local means More elaborate schemes mostly rely on patches and use more sophisticated estimators than the average But we will need to study some more of the basics first... 85
  • 111. Questions? Next class: basics of filtering II Sources and images courtesy of • L. Condat • L. Denis • J.Y. Gil • H. Jiang • I. Kokkinos • R. Kimmel • F. Luisier • S. Seitz • M. Thompson • V.-T. Ta • C. Vonesh • Wikipedia 85