In this talk, I address two new ideas in sampling geometric objects. The first is a new take on adaptive sampling with respect to the local feature size, i.e., the distance to the medial axis. We recently proved that such samples can be viewed as uniform samples with respect to an alternative metric on Euclidean space. The second is a generalization of Voronoi refinement sampling. There, one also achieves an adaptive sample while simultaneously "discovering" the underlying sizing function. We show how to construct such samples that are spaced uniformly with respect to the $k$th nearest neighbor distance function.
4-6. Surface Reconstruction
Unknown surface $X$. Finite sample $P$. Medial axis $L$.
The local feature size is $f_L(x) := \min_{y \in L} \|x - y\|$.
We say $P$ is an (adaptive) $\varepsilon$-sample if for all $x \in X$, there is a $p \in P$ s.t. $\|x - p\| \le \varepsilon f_L(x)$.
Several known algorithms produce a homeomorphic reconstruction given such a sample.
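As a toy illustration of this definition (my own, not from the talk): the sketch below checks the adaptive $\varepsilon$-sample condition for a finite sample, estimating $f_L$ from a finite sample of the medial axis. The helper names are hypothetical.

```python
import math

def local_feature_size(x, medial_axis):
    """f_L(x): Euclidean distance from x to a (finite sample of the) medial axis L."""
    return min(math.dist(x, y) for y in medial_axis)

def is_adaptive_sample(X, P, L, eps):
    """Check the (adaptive) eps-sample condition: every x in X has a
    sample point p in P with ||x - p|| <= eps * f_L(x)."""
    return all(
        min(math.dist(x, p) for p in P) <= eps * local_feature_size(x, L)
        for x in X
    )

# Toy example: points on the unit circle; the medial axis of a circle
# is its center, so f_L(x) = 1 for every x on the circle.
n = 100
X = [(math.cos(2 * math.pi * i / n), math.sin(2 * math.pi * i / n)) for i in range(n)]
P = X[::5]            # keep every 5th point as the sample
L = [(0.0, 0.0)]      # medial axis sample: the center
print(is_adaptive_sample(X, P, L, eps=0.2))   # True: worst chord is ~0.125
```

Note that for a uniform source like the circle, an adaptive sample is simply a uniform one; the adaptivity only shows up when $f_L$ varies over $X$.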
7-14. An Adaptive Metric
Idea: let's replace our variable-radii balls with equal-radii balls in a different metric.
For a curve $\gamma : [0, a] \to \mathbb{R}^d$, define
$$\mathrm{len}(\gamma) := \int_0^a \|\gamma'(t)\|_L \, dt = \int_0^a \frac{\|\gamma'(t)\|}{f_L(\gamma(t))} \, dt = \int_\gamma \frac{dz}{f_L(z)}.$$
The metric induced by $L$ is
$$d_L(x, y) := \inf_{\gamma \in \mathrm{Path}(x, y)} \mathrm{len}(\gamma).$$
Coming up: adaptive samples correspond to uniform samples in the metric induced by $L$.
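One way to make the induced metric concrete is to discretize it: approximate $d_L$ by shortest paths in a graph whose edge weights are Euclidean lengths divided by $f_L$ at the edge midpoint. This is my own numerical sketch, not an algorithm from the talk; it uses Dijkstra's algorithm and a 1D sanity check where the integral has a closed form.

```python
import heapq
import math

def induced_dist(start, goal, edges, f_L):
    """Approximate d_L(start, goal) by Dijkstra on a graph whose edge (u, v)
    costs ||u - v|| / f_L(midpoint), a discretization of the integral of dz / f_L(z)."""
    def w(u, v):
        mid = tuple((a + b) / 2 for a, b in zip(u, v))
        return math.dist(u, v) / f_L(mid)
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            return d
        if d > dist.get(u, math.inf):
            continue
        for v in edges[u]:
            nd = d + w(u, v)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return math.inf

# 1D sanity check: L = {0}, so f_L(z) = |z|; along the segment from 1 to e
# the induced length is the integral of dz/z from 1 to e, which equals 1.
xs = [1 + i * (math.e - 1) / 1000 for i in range(1001)]
nodes = [(x,) for x in xs]
edges = {u: [] for u in nodes}
for a, b in zip(nodes, nodes[1:]):
    edges[a].append(b)
    edges[b].append(a)
d = induced_dist(nodes[0], nodes[-1], edges, lambda p: abs(p[0]))
print(round(d, 3))  # 1.0
```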
26. Niyogi-Smale-Weinberger Homology Inference of Submanifolds
0. Denoise the data.
1. Take a union of balls.
2. Compute the homology of the resulting Čech complex.
To prove: the density is bounded from below near $M$ and from above far from $M$ (related to the kNN density estimator). Explicitly construct a homotopy equivalence between the union of balls and the submanifold (using the distance).
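The denoising step can be sketched with a kNN density filter: keep only points whose kth nearest neighbor is close, i.e. points in dense regions under the kNN density estimator. This is my own minimal illustration of the idea (the threshold and names are hypothetical), not the NSW procedure itself.

```python
import heapq
import math

def knn_denoise(points, k, threshold):
    """Keep only points whose distance to their kth nearest neighbor
    (excluding themselves) is at most the threshold."""
    def knn_dist(p):
        # k + 1 smallest distances, because p itself is at distance 0
        return heapq.nsmallest(k + 1, (math.dist(p, q) for q in points))[-1]
    return [p for p in points if knn_dist(p) <= threshold]

# Dense cluster near the origin plus two far-away outliers.
cluster = [(0.1 * i, 0.0) for i in range(10)]
outliers = [(50.0, 50.0), (-40.0, 10.0)]
kept = knn_denoise(cluster + outliers, k=3, threshold=1.0)
print(len(kept))  # 10: the outliers are dropped
```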
28-30. Weak Feature Size and $\mu$-Reach
Key technique: the critical point theory of distance functions, which applies to compact subsets of Riemannian manifolds.
Weak feature size: let $L$ be the critical points of the distance to $X$.
$\mu$-reach: let $L$ be the points where the gradient of the distance to $X$ is at most $\mu$.
See Chazal-Lieutier '05, '06, '09 and Chazal-Cohen-Steiner-Lieutier '09.
32-37. Where does the metric come from?
Distance to a set:
$$f_L(x) := \min_{y \in L} \|x - y\|, \qquad f_X(x) := \min_{y \in X} \|x - y\|.$$
Distance to a set in the induced metric:
$$f^L_X(y) := \min_{x \in X} d_L(x, y) = \min_{x \in X} \inf_{\gamma \in \mathrm{Path}(x, y)} \int_\gamma \frac{dz}{f_L(z)}.$$
Approximate distance to a set, using adaptivity in the Euclidean metric:
$$\hat{f}^L_X(y) := \min_{x \in X} \frac{\|x - y\|}{f_L(x)}.$$
Offset filtrations:
$$A^L_X(r) := \{x \in \mathbb{R}^d \mid f^L_X(x) \le r\},$$
$$B^L_X(r) := \{x \in \mathbb{R}^d \mid \hat{f}^L_X(x) \le r\} = \bigcup_{x \in X} \mathrm{ball}(x, r f_L(x)).$$
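For finite sets, the approximate distance $\hat{f}^L_X$ and membership in the $B$-offsets are directly computable. A small sketch (my own names, toy data):

```python
import math

def f_L(x, L):
    """Distance from x to a finite sample L of the medial axis."""
    return min(math.dist(x, y) for y in L)

def approx_dist(y, X, L):
    """hat f^L_X(y) = min over x in X of ||x - y|| / f_L(x): the smallest r
    such that y lies in some ball(x, r * f_L(x)), i.e. in B^L_X(r)."""
    return min(math.dist(x, y) / f_L(x, L) for x in X)

def in_B_offset(y, X, L, r):
    """Membership in B^L_X(r), the union of ball(x, r * f_L(x)) over x in X."""
    return approx_dist(y, X, L) <= r

# Example: X sampled on the unit circle (f_L = 1 with L = {center}).
n = 8
X = [(math.cos(2 * math.pi * i / n), math.sin(2 * math.pi * i / n)) for i in range(n)]
L = [(0.0, 0.0)]
y = (1.1, 0.0)
print(round(approx_dist(y, X, L), 3))  # 0.1: nearest x is (1, 0) and f_L(x) = 1
print(in_B_offset(y, X, L, 0.2))       # True
```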
40-42. Uniform Samples of Adaptive Metrics
$$d_L(x, Y) := \min_{y \in Y} d_L(x, y),$$
$$d^L_H(X, Y) := \max\Big\{\max_{x \in X} d_L(x, Y), \; \max_{y \in Y} d_L(y, X)\Big\}.$$
Uniform sample: $d^L_H(P, X) \le \varepsilon$.
Theorem. Let $L$ and $X$ be compact sets, let $P \subset X$ be a sample, and let $\varepsilon \in [0, 1)$ be a constant. If $P$ is an $\varepsilon$-sample of $X$ with respect to the distance to $L$, then $d^L_H(X, P) \le \frac{\varepsilon}{1 - \varepsilon}$. Also, if $d^L_H(X, P) \le \varepsilon < \frac{1}{2}$, then $P$ is an $\frac{\varepsilon}{1 - \varepsilon}$-sample of $X$ with respect to the distance to $L$.
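For finite sets, the Hausdorff distance $d^L_H$ is directly computable once a metric is available. A minimal sketch (mine), written against an arbitrary metric callable so that an approximation of $d_L$ could be plugged in; the demo uses the Euclidean metric only as a stand-in:

```python
import math

def hausdorff(X, Y, d):
    """Hausdorff distance between finite sets X and Y under a metric d.
    Here d would be (an approximation of) the induced metric d_L."""
    return max(
        max(min(d(x, y) for y in Y) for x in X),
        max(min(d(x, y) for x in X) for y in Y),
    )

# Sanity check with the Euclidean metric standing in for d_L:
X = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
P = [(0.0, 0.0), (2.0, 0.0)]       # a subsample of X
print(hausdorff(X, P, math.dist))  # 1.0: the point (1, 0) is 1 away from P
```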
44-45. Interleaving Filtrations
A pair of filtrations $(F, G)$ is $(h_1, h_2)$-interleaved in $(s, t)$ if $F(r) \subseteq G(h_1(r))$ whenever $r, h_1(r) \in (s, t)$ and $G(r) \subseteq F(h_2(r))$ whenever $r, h_2(r) \in (s, t)$. ($h_1$ and $h_2$ must be nondecreasing in $(s, t)$.)
Lemma (Composing Interleavings). If $(F, G)$ is $(h_1, h_2)$-interleaved in $(s_1, t_1)$, and $(G, H)$ is $(h_3, h_4)$-interleaved in $(s_2, t_2)$, then $(F, H)$ is $(h_3 \circ h_1, h_2 \circ h_4)$-interleaved in $(s_3, t_3)$, where $s_3 = \max\{s_1, s_2\}$ and $t_3 = \min\{t_1, t_2\}$.
47-48. Replacing X with a Sample
Lemma 1. If $d^L_H(\hat{X}, X) \le \varepsilon$, then $(A^L_X, A^L_{\hat{X}})$ is $(h_1, h_1)$-interleaved in $(0, 1)$, where $h_1(r) = r + \varepsilon$.
Follows from the triangle inequality and the definition of Hausdorff distance.
50-51. Using the Euclidean Metric
Lemma 2. The pair $(A^L_{\hat{X}}, B^L_{\hat{X}})$ is $(h_2, h_2)$-interleaved in $(0, 1)$, where $h_2(r) = \frac{r}{1 - r}$.
The key step is to prove the interleaving for a single point $x$.
53-54. Approximating L
Lemma 3. If $d^{\hat{X}}_H(L, \hat{L}) \le \delta$ for some $\delta < 1$, then $(B^L_{\hat{X}}, B^{\hat{L}}_{\hat{X}})$ is $(h_3, h_3)$-interleaved in $(0, 1)$, where $h_3(r) = \frac{r}{1 - \delta}$.
The sampling condition for $L$ is dual to that for $X$. This shows that an "adaptive sample" of $L$ suffices to give the desired interleaving.
56-57. The Big Interleaving
Theorem. Let $L, \hat{L} \subset \mathbb{R}^d$ and $X, \hat{X} \subset \mathbb{R}^d \setminus (L \cup \hat{L})$ be compact sets. If $d^{\hat{X}}_H(L, \hat{L}) \le \delta < 1$ and $d^L_H(\hat{X}, X) \le \varepsilon < 1$, then $(A^L_X, B^{\hat{L}}_{\hat{X}})$ are $(h_4, h_5)$-interleaved in $(0, 1)$, where
$$h_4(r) = \frac{r + \varepsilon}{(1 - r - \varepsilon)(1 - \delta)} \quad \text{and} \quad h_5(r) = \frac{r}{1 - \delta - r} + \varepsilon.$$
Takeaway: sufficient conditions on sampling $X$ and $L$ to guarantee an interleaving.
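As a sanity check (my own, under the assumption that $h_4$ and $h_5$ arise from Lemmas 1-3 through the Composing Interleavings lemma), the closed forms agree numerically with the compositions $h_3 \circ h_2 \circ h_1$ and $h_1 \circ h_2 \circ h_3$:

```python
# Verify h4 = h3 ∘ h2 ∘ h1 and h5 = h1 ∘ h2 ∘ h3 for sample parameters.
eps, delta = 0.05, 0.1

h1 = lambda r: r + eps                                   # Lemma 1
h2 = lambda r: r / (1 - r)                               # Lemma 2
h3 = lambda r: r / (1 - delta)                           # Lemma 3

h4 = lambda r: (r + eps) / ((1 - r - eps) * (1 - delta)) # theorem, forward
h5 = lambda r: r / (1 - delta - r) + eps                 # theorem, backward

for r in [0.01, 0.1, 0.3, 0.5]:
    assert abs(h3(h2(h1(r))) - h4(r)) < 1e-12
    assert abs(h1(h2(h3(r))) - h5(r)) < 1e-12
print("compositions match")
```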
60-62. kNN Sampling
$f_{P,k}(x) :=$ distance to the $k$th nearest point of $P$.
Let $P$ be a finite set of points in a bounding region $B$.
Goal: compute a set $M$ such that $M$ is $\tau$-well-spaced, and for all $x \in B$,
$$\alpha f_{P,k}(x) \le f_{M,2}(x) \le \beta f_{P,k}(x)$$
for some constants $\alpha$ and $\beta$.
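The sizing function $f_{P,k}$ is straightforward to evaluate by brute force; a minimal sketch (function name mine):

```python
import heapq
import math

def f_Pk(x, P, k):
    """f_{P,k}(x): distance from x to the kth nearest point of P."""
    return heapq.nsmallest(k, (math.dist(x, p) for p in P))[-1]

P = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (3.0, 3.0)]
print(f_Pk((0.0, 0.0), P, k=2))  # 1.0: the 2nd nearest point is (1, 0)
```

Note that when the query $x$ is itself a point of $P$, it counts as its own nearest neighbor at distance 0, which is why $f_{M,2}$ (rather than $f_{M,1}$) is the natural spacing function on $M$.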
81. Meshing Point Sets
Aspect ratio (quality): $R_v / r_v \le \tau$, where $R_v$ and $r_v$ are the out-radius and in-radius of the Voronoi cell of a vertex $v$ (as labeled in the figure).
Cell sizing: $R_v = \Theta(f_{P,2}(v))$.
Constant local complexity: the degree of the 1-skeleton is $2^{O(d)}$.
Optimality and running time: $|M| = \Theta(|\text{Optimal}|)$; running time $O(n \log n + |M|)$.
82-83. kNN Sampling by Delaunay Refinement

    maintain the Voronoi/Delaunay diagram incrementally
    loop until no more points are added:
        if some Voronoi cell V contains k or more points:
            insert the farthest corner of V
        if some Delaunay circumball C contains k or more points:
            insert the center of C
        while any Voronoi cell V has aspect ratio > tau:
            insert the farthest corner of V

Based on Sparse Voronoi Refinement [Hudson et al. '06].
84. kNN Sampling by Delaunay Refinement
Key to the analysis: If the aspect ratio is bounded, then any
sufficiently small ball that is not contained entirely in a Voronoi
cell is contained entirely in a Delaunay circumball.
85-86. Summary
There is a Riemannian metric induced by the distance to a compact set that relates uniform and adaptive samples. Interleaving!
kNN sampling can be done efficiently with a simple variant of Delaunay refinement.
Thanks.