Some Thoughts on Sampling
Don Sheehy
University of Connecticut
Joint work with Nick Cavanna and Kirk Gardner
Surface Reconstruction

Unknown surface X.
Finite sample P.
Medial axis L.

We say P is an (adaptive) ε-sample if for all x ∈ X, there is a p ∈ P such that ‖x − p‖ / f_L(x) ≤ ε.

Several known algorithms produce a homeomorphic reconstruction given such a sample.

    f_L(x) := min_{y ∈ L} ‖x − y‖    (the local feature size)
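The adaptive ε-sample condition can be checked directly on finite data. A minimal sketch, assuming X, P, and the medial axis L are all given as finite point arrays; `local_feature_size` and `is_adaptive_sample` are illustrative names, not from the talk:

```python
import numpy as np

def local_feature_size(x, L):
    # f_L(x) = min over y in L of ||x - y||, with L a finite
    # approximation of the medial axis
    return np.min(np.linalg.norm(L - x, axis=1))

def is_adaptive_sample(X, P, L, eps):
    # P is an adaptive eps-sample of X if every x in X has some p in P
    # with ||x - p|| / f_L(x) <= eps
    for x in X:
        d = np.min(np.linalg.norm(P - x, axis=1))
        if d > eps * local_feature_size(x, L):
            return False
    return True

# Example: the unit circle has medial axis {origin}, so f_L = 1 on X.
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])
P = X[::10]                     # keep every 10th sample point
L = np.array([[0.0, 0.0]])
print(is_adaptive_sample(X, P, L, 0.1))   # True at this density
```

With every 10th point kept, the worst gap is a chord of 2·sin(2.5°) ≈ 0.087, so the check passes at ε = 0.1 but fails at ε = 0.05.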
An Adaptive Metric

Idea: replace our variable-radii balls with equal-radii balls in a different metric.

For a path γ : [0, a] → R^d,

    len(γ) := ∫₀ᵃ ‖γ′(t)‖_L dt = ∫₀ᵃ ‖γ′(t)‖ / f_L(γ(t)) dt = ∫_γ dz / f_L(z)

    d_L(x, y) := inf_{γ ∈ Path(x,y)} len(γ)

This is the metric induced by L.

Coming up: adaptive samples correspond to uniform samples in the metric induced by L.
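The path-length integral above can be approximated numerically with a midpoint rule on a polyline. A sketch, assuming L is a finite point set standing in for the medial axis; `adaptive_length` is an illustrative name:

```python
import numpy as np

def f_L(z, L):
    # distance from z to the (finite) set L
    return np.min(np.linalg.norm(L - z, axis=1))

def adaptive_length(path, L):
    # midpoint-rule approximation of len(gamma) = integral of dz / f_L(z)
    # along a path given as an (n, d) array of vertices
    total = 0.0
    for a, b in zip(path[:-1], path[1:]):
        mid = (a + b) / 2.0
        total += np.linalg.norm(b - a) / f_L(mid, L)
    return total

# Sanity check: with L = {origin}, the segment from (1, 0) to (e, 0)
# has adaptive length  integral_1^e dr / r = 1.
L = np.array([[0.0, 0.0]])
t = np.linspace(1.0, np.e, 5001)
path = np.column_stack([t, np.zeros_like(t)])
print(adaptive_length(path, L))   # close to 1.0
```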
Niyogi-Smale-Weinberger Homology Inference of Submanifolds

0. Denoise the data.
1. Take a union of balls.
2. Compute the homology of the resulting Čech complex.

To prove: the density is bounded from below near M and from above far from M (related to the kNN density estimator).

Explicitly construct a homotopy equivalence between the union of balls and the submanifold (using the distance).
Weak Feature Size and μ-Reach

Key technique: the critical point theory of distance functions.
Applies to compact subsets of Riemannian manifolds.

Weak feature size: let L be the critical points of the distance to X.
μ-reach: let L be the points where the gradient of the distance to X is at most μ.

See Chazal-Lieutier '05, '06, '09 and Chazal-Cohen-Steiner-Lieutier '09.
Where does the metric come from?

Distance to a set:
    f_L(x) := min_{y ∈ L} ‖x − y‖        f_X(x) := min_{y ∈ X} ‖x − y‖

Distance to a set in the induced metric:
    f^L_X(y) := min_{x ∈ X} d_L(x, y) = min_{x ∈ X} inf_{γ ∈ Path(x,y)} ∫_γ dz / f_L(z)

Approximate distance to a set, using adaptivity in the Euclidean metric:
    f̂^L_X(y) := min_{x ∈ X} ‖x − y‖ / f_L(x)

Offset filtrations:
    A^L_X(r) := {x ∈ R^d | f^L_X(x) ≤ r}
    B^L_X(r) := {x ∈ R^d | f̂^L_X(x) ≤ r} = ⋃_{x ∈ X} ball(x, r·f_L(x))
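The approximate adaptive distance f̂^L_X is directly computable from finite samples. A sketch, with finite arrays standing in for X and L; `approx_adaptive_distance` is an illustrative name:

```python
import numpy as np

def f_L(x, L):
    # Euclidean distance from x to the finite set L
    return np.min(np.linalg.norm(L - x, axis=1))

def approx_adaptive_distance(y, X, L):
    # \hat f^L_X(y) = min over x in X of ||x - y|| / f_L(x)
    y = np.asarray(y, dtype=float)
    return min(np.linalg.norm(x - y) / f_L(x, L) for x in X)

L = np.array([[0.0, 0.0]])
X = np.array([[1.0, 0.0], [0.0, 2.0]])    # f_L values: 1 and 2
print(approx_adaptive_distance([1.0, 1.0], X, L))   # sqrt(2)/2
```

The second sample point wins despite being farther in the Euclidean metric, because its larger f_L value discounts the distance.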
Uniform Samples of Adaptive Metrics

    d_L(x, Y) := min_{y ∈ Y} d_L(x, y)

    d^L_H(X, Y) := max{ max_{x ∈ X} d_L(x, Y), max_{y ∈ Y} d_L(y, X) }

Uniform sample: d^L_H(P, X) ≤ ε.

Theorem. Let L and X be compact sets, let P ⊂ X be a sample, and let ε ∈ [0, 1) be a constant. If P is an ε-sample of X with respect to the distance to L, then d^L_H(X, P) ≤ ε/(1 − ε). Also, if d^L_H(X, P) ≤ ε < 1/2, then P is an ε/(1 − ε)-sample of X with respect to the distance to L.
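The Hausdorff distance in the definition has the usual finite-set form. A sketch in the Euclidean metric (the talk uses d_L in place of ‖·‖, which would require the path metric above; `hausdorff` is an illustrative name):

```python
import numpy as np

def hausdorff(X, Y):
    # d_H(X, Y) = max{ max_x min_y ||x - y||, max_y min_x ||y - x|| }
    D = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return max(D.min(axis=1).max(), D.min(axis=0).max())

X = np.array([[0.0, 0.0], [1.0, 0.0]])
P = np.array([[0.0, 0.0]])
print(hausdorff(X, P))   # 1.0: the point (1, 0) is distance 1 from P
```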
Interleaving Filtrations

A pair of filtrations (F, G) is (h₁, h₂)-interleaved in (s, t) if F(r) ⊆ G(h₁(r)) whenever r, h₁(r) ∈ (s, t), and G(r) ⊆ F(h₂(r)) whenever r, h₂(r) ∈ (s, t). (h₁ and h₂ must be nondecreasing in (s, t).)

Lemma (Composing Interleavings). If (F, G) is (h₁, h₂)-interleaved in (s₁, t₁) and (G, H) is (h₃, h₄)-interleaved in (s₂, t₂), then (F, H) is (h₃ ∘ h₁, h₂ ∘ h₄)-interleaved in (s₃, t₃), where s₃ = max{s₁, s₂} and t₃ = min{t₁, t₂}.
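The definition can be sanity-checked on sublevel-set filtrations of functions on a grid: if ‖f − g‖∞ ≤ ε, the two filtrations are (r + ε, r + ε)-interleaved. A sketch on synthetic data; names are illustrative:

```python
import numpy as np

def sublevel(f, r):
    # F(r): the set of grid points where f <= r, as a boolean mask
    return f <= r

rng = np.random.default_rng(0)
f = rng.random((50, 50))
g = f + rng.uniform(-0.1, 0.1, size=f.shape)   # ||f - g||_inf <= 0.1
eps = 0.1

for r in np.linspace(0.0, 1.0, 11):
    # F(r) ⊆ G(r + eps) and G(r) ⊆ F(r + eps), as boolean implications
    assert np.all(~sublevel(f, r) | sublevel(g, r + eps))
    assert np.all(~sublevel(g, r) | sublevel(f, r + eps))
print("interleaving verified")
```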
Replacing X with a Sample

Lemma 1. If d^L_H(X̂, X) ≤ ε, then (A^L_X, A^L_X̂) is (h₁, h₁)-interleaved in (0, 1), where h₁(r) = r + ε.

Follows from the triangle inequality and the definition of Hausdorff distance.
Using the Euclidean Metric

Lemma 2. The pair (A^L_X̂, B^L_X̂) is (h₂, h₂)-interleaved in (0, 1), where h₂(r) = r / (1 − r).

The key step is to prove the interleaving for a single point x.
Approximating L

Lemma 3. If d^X̂_H(L, L̂) ≤ δ for some δ < 1, then (B^L_X̂, B^L̂_X̂) is (h₃, h₃)-interleaved in (0, 1), where h₃(r) = r / (1 − δ).

The sampling condition for L is dual to that for X.
This shows that an "adaptive sample" of L suffices to give the desired interleaving.
The Big Interleaving

Theorem. Let L, L̂ ⊂ R^d and X, X̂ ⊂ R^d ∖ (L ∪ L̂) be compact sets. If d^X̂_H(L, L̂) ≤ δ < 1 and d^L_H(X̂, X) ≤ ε < 1, then (A^L_X, B^L̂_X̂) are (h₄, h₅)-interleaved in (0, 1), where h₄(r) = (r + ε) / ((1 − r − ε)(1 − δ)) and h₅(r) = r / (1 − r) + ε.

Takeaway: sufficient conditions on sampling X and L to guarantee an interleaving.
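The bound h₄ in the theorem is exactly the composition h₃ ∘ h₂ ∘ h₁ of the three lemma maps, as the composition lemma predicts. A quick numerical check:

```python
# Interleaving maps from Lemmas 1-3 and the theorem's h4
def h1(r, eps):        return r + eps
def h2(r):             return r / (1.0 - r)
def h3(r, delta):      return r / (1.0 - delta)
def h4(r, eps, delta): return (r + eps) / ((1.0 - r - eps) * (1.0 - delta))

eps, delta = 0.05, 0.1
for r in [0.0, 0.1, 0.3, 0.5]:
    assert abs(h3(h2(h1(r, eps)), delta) - h4(r, eps, delta)) < 1e-12
print("h4 = h3 o h2 o h1")
```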
kNN Sampling

Let P be a finite set of points in a bounding region B.

    f_{P,k}(x) := distance from x to the kth nearest point of P

Goal: compute a set M such that M is τ-well-spaced and, for all x ∈ B,

    α f_{P,k}(x) ≤ f_{M,2}(x) ≤ β f_{P,k}(x)

for some constants α and β.
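The function f_{P,k} is straightforward to evaluate for a finite P. A brute-force sketch in pure numpy (at scale one would use a k-d tree instead; `f_Pk` is an illustrative name):

```python
import numpy as np

def f_Pk(x, P, k):
    # distance from x to the k-th nearest point of P
    d = np.sort(np.linalg.norm(P - np.asarray(x, dtype=float), axis=1))
    return d[k - 1]

P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
print(f_Pk([0.0, 0.0], P, 2))   # 1.0: the 2nd nearest point is at distance 1
```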
Mesh Generation

Decompose a domain into simple elements.

Mesh quality: Radius/Edge < const; OutRadius/InRadius < const.
Conforming to input.
Voronoi diagram.
Meshing Point Sets

Input: P ⊂ R^d.
Output: M ⊃ P with a "nice" Voronoi diagram.
n = |P|, m = |M|.

Aspect ratio (quality): R_v / r_v ≤ τ, where R_v and r_v are the out-radius and in-radius of the Voronoi cell of a vertex v.
Cell sizing: R_v = Θ(f_{P,2}(v)).
Constant local complexity: the degree of the 1-skeleton is 2^{O(d)}.
Optimality and running time: |M| = Θ(|Optimal|); running time O(n log n + |M|).
kNN Sampling by Delaunay Refinement

maintain the Voronoi/Delaunay diagram incrementally
loop until no more points are added:
	if some Voronoi cell V contains k or more points:
		insert the farthest corner of V
	if some Delaunay circumball C contains k or more points:
		insert the center of C
	while any Voronoi cell V has aspect ratio > τ:
		insert the farthest corner of V

Based on Sparse Voronoi Refinement [Hudson et al. '06].

Key to the analysis: if the aspect ratio is bounded, then any sufficiently small ball that is not contained entirely in a Voronoi cell is contained entirely in a Delaunay circumball.
Summary

There is a Riemannian metric induced by the distance to a compact set that relates uniform and adaptive samples. Interleaving!

kNN sampling can be done efficiently with a simple variant of Delaunay refinement.

Thanks.