Lecture 6: Minimum encoding ball and Support vector
data description (SVDD)
Stéphane Canu
stephane.canu@litislab.eu
Sao Paulo 2014
May 12, 2014
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
The minimum enclosing ball problem [Tax and Duin, 2004]

Given n points {x_i, i = 1, ..., n}, find the smallest ball (center c, radius R) enclosing them all:

\min_{R \in \mathbb{R},\, c \in \mathbb{R}^d} \; R^2
\quad \text{with} \quad \|x_i - c\|^2 \le R^2, \; i = 1, \ldots, n

What is that in the convex programming hierarchy?
LP, QP, QCQP, SOCP and SDP

Stéphane Canu (INSA Rouen - LITIS) May 12, 2014 3 / 35
The convex programming hierarchy (part of)

LP:    \min_x \; f^\top x \quad \text{with} \quad Ax \le d \;\text{ and }\; 0 \le x

QP:    \min_x \; \tfrac{1}{2} x^\top G x + f^\top x \quad \text{with} \quad Ax \le d

QCQP:  \min_x \; \tfrac{1}{2} x^\top G x + f^\top x \quad \text{with} \quad x^\top B_i x + a_i^\top x \le d_i, \; i = 1, \ldots, n

SOCP:  \min_x \; f^\top x \quad \text{with} \quad \|x - a_i\| \le b_i^\top x + d_i, \; i = 1, \ldots, n

The convex programming hierarchy?
Model generality: LP < QP < QCQP < SOCP < SDP
MEB as a QP in the primal

Theorem (MEB as a QP)
The two following problems are equivalent:

\min_{R \in \mathbb{R},\, c \in \mathbb{R}^d} \; R^2 \quad \text{with} \quad \|x_i - c\|^2 \le R^2, \; i = 1, \ldots, n

\min_{w, \rho} \; \tfrac{1}{2}\|w\|^2 - \rho \quad \text{with} \quad w^\top x_i \ge \rho + \tfrac{1}{2}\|x_i\|^2, \; i = 1, \ldots, n

with \rho = \tfrac{1}{2}(\|c\|^2 - R^2) and w = c.

Proof:
\|x_i - c\|^2 \le R^2
\iff \|x_i\|^2 - 2 x_i^\top c + \|c\|^2 \le R^2
\iff -2 x_i^\top c \le R^2 - \|x_i\|^2 - \|c\|^2
\iff 2 x_i^\top c \ge -R^2 + \|x_i\|^2 + \|c\|^2
\iff x_i^\top c \ge \underbrace{\tfrac{1}{2}(\|c\|^2 - R^2)}_{\rho} + \tfrac{1}{2}\|x_i\|^2
MEB and the one class SVM

SVDD:
\min_{w, \rho} \; \tfrac{1}{2}\|w\|^2 - \rho \quad \text{with} \quad w^\top x_i \ge \rho + \tfrac{1}{2}\|x_i\|^2

SVDD and linear OCSVM (supporting hyperplane): if \|x_i\|^2 = \text{constant} for all i = 1, \ldots, n, it is the linear one class SVM (OC SVM).

The linear one class SVM [Schölkopf and Smola, 2002]:
\min_{w, \rho'} \; \tfrac{1}{2}\|w\|^2 - \rho' \quad \text{with} \quad w^\top x_i \ge \rho'

with \rho' = \rho + \tfrac{1}{2}\|x_i\|^2  ⇒  OC SVM is a particular case of SVDD.
When \|x_i\|^2 = 1 for all i = 1, \ldots, n:

\|x_i - c\|^2 \le R^2 \iff w^\top x_i \ge \rho \quad \text{with} \quad \rho = \tfrac{1}{2}(\|c\|^2 - R^2 + 1)

SVDD and OCSVM: "belonging to the ball" is also "being above" a hyperplane.
MEB: KKT

L(c, R, \alpha) = R^2 + \sum_{i=1}^n \alpha_i \left( \|x_i - c\|^2 - R^2 \right)

KKT conditions:
stationarity:    2c \sum_{i=1}^n \alpha_i - 2 \sum_{i=1}^n \alpha_i x_i = 0  ← the representer theorem
                 1 - \sum_{i=1}^n \alpha_i = 0
primal admiss.:  \|x_i - c\|^2 \le R^2
dual admiss.:    \alpha_i \ge 0, \; i = 1, \ldots, n
complementarity: \alpha_i \left( \|x_i - c\|^2 - R^2 \right) = 0, \; i = 1, \ldots, n

Complementarity tells us there are two groups of points:
the support vectors, with \|x_i - c\|^2 = R^2, and the insiders, with \alpha_i = 0.
MEB: Dual

The representer theorem:
c = \frac{\sum_{i=1}^n \alpha_i x_i}{\sum_{i=1}^n \alpha_i} = \sum_{i=1}^n \alpha_i x_i \quad \text{(since } \textstyle\sum_{i=1}^n \alpha_i = 1\text{)}

L(\alpha) = \sum_{i=1}^n \alpha_i \Big\| x_i - \sum_{j=1}^n \alpha_j x_j \Big\|^2

Using \sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j x_i^\top x_j = \alpha^\top G \alpha and \sum_{i=1}^n \alpha_i x_i^\top x_i = \alpha^\top \mathrm{diag}(G), with G = XX^\top the Gram matrix, G_{ij} = x_i^\top x_j:

\min_{\alpha \in \mathbb{R}^n} \; \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G)
\quad \text{with} \quad e^\top \alpha = 1 \;\text{ and }\; 0 \le \alpha_i, \; i = 1, \ldots, n
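Since the dual above is a QP over the simplex, even a very simple first-order method solves it. The sketch below (Python with NumPy; the function name and the toy points are ours, not from the lecture) uses Frank-Wolfe, whose linear step selects the point farthest from the current center, i.e. the Badoiu-Clarkson core-set update:

```python
import numpy as np

def meb_dual_frank_wolfe(X, n_iter=20000):
    # Solve min_a a'Ga - a'diag(G) over {a >= 0, e'a = 1} with Frank-Wolfe.
    # The linear minimization step picks the point farthest from the
    # current center c = X'a (the Badoiu-Clarkson core-set update).
    n = X.shape[0]
    G = X @ X.T                      # Gram matrix G_ij = x_i' x_j
    alpha = np.full(n, 1.0 / n)      # feasible start: e'alpha = 1, alpha >= 0
    for t in range(n_iter):
        grad = 2 * G @ alpha - np.diag(G)
        j = np.argmin(grad)          # index of the farthest point
        gamma = 2.0 / (t + 2)        # standard Frank-Wolfe step size
        alpha = (1 - gamma) * alpha
        alpha[j] += gamma
    return alpha

# Five points whose minimum enclosing ball is the unit ball centered at 0
X = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.], [0.1, 0.2]])
alpha = meb_dual_frank_wolfe(X)
c = alpha @ X                        # representer theorem: c = sum_i alpha_i x_i
R = np.max(np.linalg.norm(X - c, axis=1))
```

The recovered center is (approximately) the origin and the radius is 1; the fifth point, an insider, ends up carrying (almost) no weight.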
SVDD primal vs. dual

Primal:
\min_{R \in \mathbb{R},\, c \in \mathbb{R}^d} \; R^2 \quad \text{with} \quad \|x_i - c\|^2 \le R^2, \; i = 1, \ldots, n
d + 1 unknowns
n constraints
can be recast as a QP
perfect when d << n

Dual:
\min_{\alpha} \; \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G) \quad \text{with} \quad e^\top \alpha = 1 \;\text{ and }\; 0 \le \alpha_i, \; i = 1, \ldots, n
n unknowns, with G the pairwise influence Gram matrix
n box constraints
easy to solve
to be used when d > n

But where is R^2?
Looking for R^2

\min_{\alpha} \; \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G) \quad \text{with} \quad e^\top \alpha = 1, \; 0 \le \alpha_i, \; i = 1, \ldots, n

The Lagrangian: L(\alpha, \mu, \beta) = \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G) + \mu(e^\top \alpha - 1) - \beta^\top \alpha
Stationarity cond.: \nabla_\alpha L(\alpha, \mu, \beta) = 2 G \alpha - \mathrm{diag}(G) + \mu e - \beta = 0

The bidual:
\min_{\alpha} \; \alpha^\top G \alpha + \mu \quad \text{with} \quad -2 G \alpha + \mathrm{diag}(G) \le \mu e

By identification:
R^2 = \mu + \alpha^\top G \alpha = \mu + \|c\|^2
where \mu is the Lagrange multiplier associated with the equality constraint \sum_{i=1}^n \alpha_i = 1.

Also, because of the complementarity condition, if x_i is a support vector (\alpha_i > 0, hence \beta_i = 0), then R^2 = \|x_i - c\|^2.
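The identification R^2 = \mu + \alpha^\top G \alpha can be checked by hand on a toy configuration where the dual solution is known analytically (a sketch; the three points below are chosen so that \alpha = (1/2, 1/2, 0) satisfies the KKT conditions):

```python
import numpy as np

# Three points on a line: the MEB is [-1, 1], i.e. center 0 and radius 1,
# supported by x_1 = -1 and x_2 = +1 (alpha = 1/2 each); x_3 is an insider
# (alpha = 0).
X = np.array([[-1.0], [1.0], [0.3]])
G = X @ X.T
alpha = np.array([0.5, 0.5, 0.0])    # dual solution, known analytically here

c = alpha @ X                        # center, from the representer theorem
# Stationarity 2 G a - diag(G) + mu e - beta = 0 with beta_i = 0 wherever
# alpha_i > 0 (a support vector), so mu = G_ii - 2 (G a)_i at such an i:
i = 0
mu = G[i, i] - 2 * (G @ alpha)[i]
R2 = mu + alpha @ G @ alpha          # the identification R^2 = mu + ||c||^2
```

Here mu = 1 and alpha' G alpha = ||c||^2 = 0, so R^2 = 1, which is indeed the squared distance from the center to both support vectors.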
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
The minimum enclosing ball problem with errors

The same road map:
initial formulation
reformulation (as a QP)
Lagrangian, KKT
dual formulation
bidual

Initial formulation, for a given C:

\min_{R, c, \xi} \; R^2 + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad \|x_i - c\|^2 \le R^2 + \xi_i, \; i = 1, \ldots, n
\quad \text{and} \quad \xi_i \ge 0, \; i = 1, \ldots, n
The MEB with slack: QP, KKT, dual and R^2

SVDD as a QP:
\min_{w, \rho, \xi} \; \tfrac{1}{2}\|w\|^2 - \rho + \tfrac{C}{2} \sum_{i=1}^n \xi_i
\quad \text{with} \quad w^\top x_i \ge \rho + \tfrac{1}{2}\|x_i\|^2 - \tfrac{1}{2}\xi_i \;\text{ and }\; \xi_i \ge 0, \; i = 1, \ldots, n

again with OC SVM as a particular case.

Dual SVDD, with G = XX^\top:
\min_{\alpha} \; \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G)
\quad \text{with} \quad e^\top \alpha = 1 \;\text{ and }\; 0 \le \alpha_i \le C, \; i = 1, \ldots, n

for a given C \le 1 (if C \ge 1, the box constraint is inactive and we are back to the no slack case).

R^2 = \mu + c^\top c
with \mu denoting the Lagrange multiplier associated with the equality constraint \sum_{i=1}^n \alpha_i = 1.
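A minimal sketch of the boxed dual, using SciPy's general-purpose SLSQP solver in place of a dedicated QP solver (the data, C and tolerances are illustrative, not from the lecture): with a small C, a far outlier's multiplier is capped at C and the ball no longer stretches to enclose it.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(scale=0.3, size=(12, 2)),  # a tight cluster
               [[10.0, 0.0]]])                       # one far outlier
n = len(X)
G = X @ X.T
C = 0.1                              # box 0 <= alpha_i <= C (feasible: n*C >= 1)

def dual(a):
    # dual objective: alpha' G alpha - alpha' diag(G)
    return a @ G @ a - a @ np.diag(G)

res = minimize(dual, np.full(n, 1.0 / n), method="SLSQP",
               bounds=[(0.0, C)] * n,
               constraints=[{"type": "eq", "fun": lambda a: a.sum() - 1.0}])
alpha = res.x
c = alpha @ X                        # center, from the representer theorem
```

At the optimum the outlier has positive slack, so by the KKT condition C - alpha_i - beta_i = 0 with beta_i = 0 its multiplier sits exactly on the box bound alpha_i = C.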
Variations over SVDD

Adaptive SVDD: the weighted error case, for given w_i, i = 1, \ldots, n:

\min_{c \in \mathbb{R}^p,\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n} \; R + C \sum_{i=1}^n w_i \xi_i
\quad \text{with} \quad \|x_i - c\|^2 \le R + \xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n

The dual of this problem is a QP [see for instance Liu et al., 2013]:

\min_{\alpha \in \mathbb{R}^n} \; \alpha^\top XX^\top \alpha - \alpha^\top \mathrm{diag}(XX^\top)
\quad \text{with} \quad \sum_{i=1}^n \alpha_i = 1, \; 0 \le \alpha_i \le C w_i, \; i = 1, \ldots, n

Density induced SVDD (D-SVDD):

\min_{c \in \mathbb{R}^p,\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n} \; R + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad w_i \|x_i - c\|^2 \le R + \xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
SVDD in a RKHS

The feature map:
\mathbb{R}^p \longrightarrow \mathcal{H}, \qquad c \longmapsto f(\cdot), \qquad x_i \longmapsto k(x_i, \cdot)
\|x_i - c\|^2_{\mathbb{R}^p} \le R^2 \;\longrightarrow\; \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R^2

Kernelized SVDD (in a RKHS) is also a QP:

\min_{f \in \mathcal{H},\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n} \; R^2 + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R^2 + \xi_i, \; i = 1, \ldots, n
\quad \text{and} \quad \xi_i \ge 0, \; i = 1, \ldots, n
SVDD in a RKHS: KKT, dual and R^2

L = R^2 + C \sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i \left( \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 - \xi_i \right) - \sum_{i=1}^n \beta_i \xi_i
  = R^2 + C \sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i \left( k(x_i, x_i) - 2 f(x_i) + \|f\|^2_{\mathcal{H}} - R^2 - \xi_i \right) - \sum_{i=1}^n \beta_i \xi_i

KKT conditions:
Stationarity:
2 f(\cdot) \sum_{i=1}^n \alpha_i - 2 \sum_{i=1}^n \alpha_i k(\cdot, x_i) = 0  ← the representer theorem
1 - \sum_{i=1}^n \alpha_i = 0
C - \alpha_i - \beta_i = 0
Primal admissibility: \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R^2 + \xi_i, \; \xi_i \ge 0
Dual admissibility: \alpha_i \ge 0, \; \beta_i \ge 0
Complementarity:
\alpha_i \left( \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 - \xi_i \right) = 0
\beta_i \xi_i = 0
SVDD in a RKHS: dual and R^2

L(\alpha) = \sum_{i=1}^n \alpha_i k(x_i, x_i) - 2 \sum_{i=1}^n \alpha_i f(x_i) + \|f\|^2_{\mathcal{H}}, \quad \text{with} \quad f(\cdot) = \sum_{j=1}^n \alpha_j k(\cdot, x_j)
         = \sum_{i=1}^n \alpha_i k(x_i, x_i) - \sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j \underbrace{k(x_i, x_j)}_{G_{ij}}

With G_{ij} = k(x_i, x_j):

\min_{\alpha} \; \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G)
\quad \text{with} \quad e^\top \alpha = 1 \;\text{ and }\; 0 \le \alpha_i \le C, \; i = 1, \ldots, n

As it is in the linear case:
R^2 = \mu + \|f\|^2_{\mathcal{H}}
with \mu denoting the Lagrange multiplier associated with the equality constraint \sum_{i=1}^n \alpha_i = 1.
SVDD train and val in a RKHS

Train using the dual form (in: G, C; out: \alpha, \mu):

\min_{\alpha} \; \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G)
\quad \text{with} \quad e^\top \alpha = 1 \;\text{ and }\; 0 \le \alpha_i \le C, \; i = 1, \ldots, n

Val, with the center in the RKHS f(\cdot) = \sum_{i=1}^n \alpha_i k(\cdot, x_i):

\phi(x) = \|k(x, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2
        = \|k(x, \cdot)\|^2_{\mathcal{H}} - 2 \langle k(x, \cdot), f(\cdot) \rangle_{\mathcal{H}} + \|f(\cdot)\|^2_{\mathcal{H}} - R^2
        = k(x, x) - 2 f(x) + R^2 - \mu - R^2
        = k(x, x) - 2 f(x) - \mu
        = k(x, x) - 2 \sum_{i=1}^n \alpha_i k(x, x_i) - \mu

\phi(x) = 0 is the decision border.
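The train/val pipeline above can be sketched end to end with a Gaussian kernel. This is a sketch, not the lecture's code: Frank-Wolfe stands in for the QP solver in the no-slack case, and \mu is recovered as R^2 - \|f\|^2_{\mathcal{H}} rather than read off a solver's multipliers.

```python
import numpy as np

def rbf(A, B, s=1.0):
    # Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 s^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s ** 2))

# Train: solve min_a a'Ga - a'diag(G), e'a = 1, a >= 0 (no slack) by Frank-Wolfe
X = np.random.default_rng(1).normal(size=(40, 2))
G = rbf(X, X)
n = len(X)
alpha = np.full(n, 1.0 / n)
for t in range(5000):
    grad = 2 * G @ alpha - np.diag(G)
    j = np.argmin(grad)              # move mass toward the farthest point in H
    gamma = 2.0 / (t + 2)
    alpha = (1 - gamma) * alpha
    alpha[j] += gamma

# Recover mu = R^2 - ||f||^2_H, with R^2 = max_i ||k(x_i,.) - f||^2_H
aGa = alpha @ G @ alpha              # ||f||^2_H
dist2 = np.diag(G) - 2 * G @ alpha + aGa
mu = dist2.max() - aGa

def phi(x):
    # decision function phi(x) = k(x,x) - 2 f(x) - mu; phi > 0 means outside
    kx = rbf(x[None, :], X)[0]
    return 1.0 - 2 * kx @ alpha - mu  # k(x,x) = 1 for the Gaussian kernel

far = np.array([25.0, 25.0])
```

All training points satisfy \phi \le 0 by construction of R^2, while a point far from the data gets \phi > 0 and is flagged as outside the description.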
An important theoretical result

For a well-calibrated bandwidth, the SVDD estimates the underlying distribution level set [Vert and Vert, 2006].

The level sets of a probability density function IP(x) are the sets
C_p = \{ x \in \mathbb{R}^d \mid IP(x) \ge p \}

Each is well estimated by the empirical minimum volume set
V_p = \{ x \in \mathbb{R}^d \mid \|k(x, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 \le 0 \}

The frontiers coincide.
SVDD: the generalization error

For a well-calibrated bandwidth, let (x_1, \ldots, x_n) be i.i.d. from some fixed but unknown IP(x).
Then [Shawe-Taylor and Cristianini, 2004], with probability at least 1 - \delta (\forall \delta \in ]0, 1[), for any margin m > 0:

IP\left( \|k(x, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \ge R^2 + m \right) \le \frac{1}{mn} \sum_{i=1}^n \xi_i + \frac{6 R^2}{m \sqrt{n}} + 3 \sqrt{\frac{\ln(2/\delta)}{2n}}
Equivalence between SVDD and OCSVM for translation invariant kernels (diagonal constant kernels)

Theorem
Let \mathcal{H} be a RKHS on some domain \mathcal{X} endowed with kernel k. If there exists some constant c such that \forall x \in \mathcal{X}, k(x, x) = c, then the two following problems are equivalent:

\min_{f, R, \xi} \; R + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R + \xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n

\min_{f, \rho, \varepsilon} \; \tfrac{1}{2}\|f\|^2_{\mathcal{H}} - \rho + C \sum_{i=1}^n \varepsilon_i
\quad \text{with} \quad f(x_i) \ge \rho - \varepsilon_i, \; \varepsilon_i \ge 0, \; i = 1, \ldots, n

with \rho = \tfrac{1}{2}(c + \|f\|^2_{\mathcal{H}} - R) and \varepsilon_i = \tfrac{1}{2}\xi_i.
Proof of the equivalence between SVDD and OCSVM

\min_{f \in \mathcal{H},\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n} \; R + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R + \xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n

Since \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} = k(x_i, x_i) + \|f\|^2_{\mathcal{H}} - 2 f(x_i), this is

\min_{f \in \mathcal{H},\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n} \; R + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad 2 f(x_i) \ge k(x_i, x_i) + \|f\|^2_{\mathcal{H}} - R - \xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n.

Introducing \rho = \tfrac{1}{2}(c + \|f\|^2_{\mathcal{H}} - R), that is R = c + \|f\|^2_{\mathcal{H}} - 2\rho, and since k(x_i, x_i) is constant and equal to c, the SVDD problem becomes

\min_{f \in \mathcal{H},\, \rho \in \mathbb{R},\, \xi \in \mathbb{R}^n} \; \tfrac{1}{2}\|f\|^2_{\mathcal{H}} - \rho + \tfrac{C}{2} \sum_{i=1}^n \xi_i
\quad \text{with} \quad f(x_i) \ge \rho - \tfrac{1}{2}\xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n

leading to the classical one class SVM formulation (OCSVM)

\min_{f \in \mathcal{H},\, \rho \in \mathbb{R},\, \varepsilon \in \mathbb{R}^n} \; \tfrac{1}{2}\|f\|^2_{\mathcal{H}} - \rho + C \sum_{i=1}^n \varepsilon_i
\quad \text{with} \quad f(x_i) \ge \rho - \varepsilon_i, \; \varepsilon_i \ge 0, \; i = 1, \ldots, n

with \varepsilon_i = \tfrac{1}{2}\xi_i. Note that by putting \nu = \tfrac{1}{nC} we can get the so-called \nu formulation of the OCSVM

\min_{f' \in \mathcal{H},\, \rho' \in \mathbb{R},\, \xi' \in \mathbb{R}^n} \; \tfrac{1}{2}\|f'\|^2_{\mathcal{H}} - n\nu\rho' + \sum_{i=1}^n \xi'_i
\quad \text{with} \quad f'(x_i) \ge \rho' - \xi'_i, \; \xi'_i \ge 0, \; i = 1, \ldots, n

with f' = C f, \rho' = C \rho, and \xi' = C \xi.
Duality

Note that the dual of the SVDD is

\min_{\alpha \in \mathbb{R}^n} \; \alpha^\top G \alpha - \alpha^\top g
\quad \text{with} \quad \sum_{i=1}^n \alpha_i = 1, \; 0 \le \alpha_i \le C, \; i = 1, \ldots, n

where G is the kernel matrix of general term G_{i,j} = k(x_i, x_j) and g the diagonal vector such that g_i = k(x_i, x_i) = c. The dual of the OCSVM is the following equivalent QP

\min_{\alpha \in \mathbb{R}^n} \; \tfrac{1}{2} \alpha^\top G \alpha
\quad \text{with} \quad \sum_{i=1}^n \alpha_i = 1, \; 0 \le \alpha_i \le C, \; i = 1, \ldots, n

Both dual forms provide the same solution \alpha, but not the same Lagrange multipliers. \rho is the Lagrange multiplier of the equality constraint of the dual of the OCSVM, and R = c + \alpha^\top G \alpha - 2\rho. Using the SVDD dual, it turns out that R = \lambda_{eq} + \alpha^\top G \alpha, where \lambda_{eq} is the Lagrange multiplier of the equality constraint of the SVDD dual form.
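On the feasible set the two dual objectives differ only by the constant \alpha^\top g = c and a factor 2, so a solver driven by objective gradients over the same feasible set takes exactly the same steps on both. A sketch with Frank-Wolfe in the no-slack case (C \ge 1, so the box bound drops; data and kernel choice are illustrative), where the iterates of the two problems coincide:

```python
import numpy as np

def fw_simplex(grad_fn, n, n_iter=2000):
    # Frank-Wolfe over the simplex {a >= 0, e'a = 1} (no-slack case, C >= 1)
    a = np.full(n, 1.0 / n)
    for t in range(n_iter):
        j = np.argmin(grad_fn(a))    # linear minimization step over the simplex
        gamma = 2.0 / (t + 2)
        a = (1 - gamma) * a
        a[j] += gamma
    return a

X = np.random.default_rng(2).normal(size=(30, 2))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
G = np.exp(-d2)                      # Gaussian kernel: G_ii = 1, the constant c
g = np.diag(G).copy()                # here g = c e = e

a_svdd = fw_simplex(lambda a: 2 * G @ a - g, len(X))   # grad of a'Ga - a'g
a_ocsvm = fw_simplex(lambda a: G @ a, len(X))          # grad of (1/2) a'Ga
```

Subtracting the constant vector g and rescaling the gradient never changes the argmin of the linear step, so both runs pick the same vertices and return the same \alpha.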
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
The two class Support vector data description (SVDD)

\min_{c, R, \xi^+, \xi^-} \; R^2 + C \Big( \sum_{i : y_i = 1} \xi^+_i + \sum_{i : y_i = -1} \xi^-_i \Big)
\quad \text{with} \quad \|x_i - c\|^2 \le R^2 + \xi^+_i, \; \xi^+_i \ge 0 \;\text{ for } i \text{ such that } y_i = 1
\quad \text{and} \quad \|x_i - c\|^2 \ge R^2 - \xi^-_i, \; \xi^-_i \ge 0 \;\text{ for } i \text{ such that } y_i = -1
The two class SVDD as a QP

\min_{c, R, \xi^+, \xi^-} \; R^2 + C \Big( \sum_{i : y_i = 1} \xi^+_i + \sum_{i : y_i = -1} \xi^-_i \Big)
\quad \text{with} \quad \|x_i - c\|^2 \le R^2 + \xi^+_i, \; \xi^+_i \ge 0 \;\text{ for } i \text{ such that } y_i = 1
\quad \text{and} \quad \|x_i - c\|^2 \ge R^2 - \xi^-_i, \; \xi^-_i \ge 0 \;\text{ for } i \text{ such that } y_i = -1

Expanding the squares:
\|x_i\|^2 - 2 x_i^\top c + \|c\|^2 \le R^2 + \xi^+_i \quad (y_i = 1)
\|x_i\|^2 - 2 x_i^\top c + \|c\|^2 \ge R^2 - \xi^-_i \quad (y_i = -1)

that is
2 x_i^\top c \ge \|c\|^2 - R^2 + \|x_i\|^2 - \xi^+_i \quad (y_i = 1)
-2 x_i^\top c \ge -\|c\|^2 + R^2 - \|x_i\|^2 - \xi^-_i \quad (y_i = -1)

or, as a single family of constraints,
2 y_i x_i^\top c \ge y_i \left( \|c\|^2 - R^2 + \|x_i\|^2 \right) - \xi_i, \; \xi_i \ge 0, \; i = 1, \ldots, n

Change of variable \rho = \|c\|^2 - R^2:

\min_{c, \rho, \xi} \; \|c\|^2 - \rho + C \sum_{i=1}^n \xi_i
\quad \text{with} \quad 2 y_i x_i^\top c \ge y_i \left( \rho + \|x_i\|^2 \right) - \xi_i, \; i = 1, \ldots, n
\quad \text{and} \quad \xi_i \ge 0, \; i = 1, \ldots, n
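The sign manipulations above are easy to get wrong, so here is a quick numerical check (illustrative random data, slack set to zero) that the single rewritten constraint family matches the original per-label ball constraints:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 2))
y = rng.choice([-1.0, 1.0], size=50)
c = rng.normal(size=2)               # an arbitrary center
R2 = 1.5                             # an arbitrary squared radius
rho = c @ c - R2                     # change of variable: rho = ||c||^2 - R^2

d2 = ((X - c) ** 2).sum(axis=1)      # ||x_i - c||^2
# Original constraints: inside the ball for y = +1, outside for y = -1
original = np.where(y > 0, d2 <= R2, d2 >= R2)
# Rewritten family: 2 y_i x_i'c >= y_i (rho + ||x_i||^2)
rewritten = 2 * y * (X @ c) >= y * (rho + (X ** 2).sum(axis=1))
```

Both boolean vectors agree pointwise for both labels, confirming the multiplication by y_i folds the two inequality directions into one.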
The dual of the two class SVDD

With G_{ij} = y_i y_j x_i^\top x_j, the dual formulation is:

\min_{\alpha \in \mathbb{R}^n} \; \alpha^\top G \alpha - \sum_{i=1}^n \alpha_i y_i \|x_i\|^2
\quad \text{with} \quad \sum_{i=1}^n y_i \alpha_i = 1 \;\text{ and }\; 0 \le \alpha_i \le C, \; i = 1, \ldots, n
The two class SVDD vs. one class SVDD
The two class SVDD (left) vs. the one class SVDD (right)
Small Sphere and Large Margin (SSLM) approach

Support vector data description with margin [Wu and Ye, 2009]

\min_{c, R, \xi} \; R^2 + C \Big( \sum_{i : y_i = 1} \xi^+_i + \sum_{i : y_i = -1} \xi^-_i \Big)
\quad \text{with} \quad \|x_i - c\|^2 \le R^2 - 1 + \xi^+_i, \; \xi^+_i \ge 0 \;\text{ for } i \text{ such that } y_i = 1
\quad \text{and} \quad \|x_i - c\|^2 \ge R^2 + 1 - \xi^-_i, \; \xi^-_i \ge 0 \;\text{ for } i \text{ such that } y_i = -1

Note that \|x_i - c\|^2 \ge R^2 + 1 - \xi^-_i with y_i = -1 is equivalent to y_i \|x_i - c\|^2 \le y_i R^2 - 1 + \xi^-_i, so that

L(c, R, \xi, \alpha, \beta) = R^2 + C \sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i \left( y_i \|x_i - c\|^2 - y_i R^2 + 1 - \xi_i \right) - \sum_{i=1}^n \beta_i \xi_i
SVDD with margin: dual formulation

From the Lagrangian above, optimality yields
c = \sum_{i=1}^n \alpha_i y_i x_i; \qquad \sum_{i=1}^n \alpha_i y_i = 1; \qquad 0 \le \alpha_i \le C

L(\alpha) = \sum_{i=1}^n \alpha_i y_i \Big\| x_i - \sum_{j=1}^n \alpha_j y_j x_j \Big\|^2 + \sum_{i=1}^n \alpha_i
         = - \sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j y_i y_j x_j^\top x_i + \sum_{i=1}^n \alpha_i y_i \|x_i\|^2 + \sum_{i=1}^n \alpha_i

Dual SVDD is also a quadratic program:

problem D:
\min_{\alpha \in \mathbb{R}^n} \; \alpha^\top G \alpha - e^\top \alpha - f^\top \alpha
\quad \text{with} \quad y^\top \alpha = 1 \;\text{ and }\; 0 \le \alpha_i \le C, \; i = 1, \ldots, n

with G a symmetric n \times n matrix such that G_{ij} = y_i y_j x_j^\top x_i and f_i = y_i \|x_i\|^2.
Conclusion

Applications
outlier detection
change detection
clustering
large number of classes
variable selection, . . .

A clear path
reformulation (to a standard problem)
KKT
dual
bidual

A lot of variations
L2 SVDD
two classes, non symmetric
two classes, symmetric (SVM)
the multi class issue
practical problems with translation invariant kernels
Bibliography

Bo Liu, Yanshan Xiao, Longbing Cao, Zhifeng Hao, and Feiqi Deng. SVDD-based outlier detection on uncertain data. Knowledge and Information Systems, 34(3):597–618, 2013.

B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.

John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.

David M. J. Tax and Robert P. W. Duin. Support vector data description. Machine Learning, 54(1):45–66, 2004.

Régis Vert and Jean-Philippe Vert. Consistency and convergence rates of one-class SVMs and related algorithms. Journal of Machine Learning Research, 7:817–854, 2006.

Mingrui Wu and Jieping Ye. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(11):2088–2092, 2009.
FIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance
 
Key Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdfKey Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdf
Cheryl Hung
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
Elena Simperl
 
The Future of Platform Engineering
The Future of Platform EngineeringThe Future of Platform Engineering
The Future of Platform Engineering
Jemma Hussein Allen
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
DanBrown980551
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
Product School
 
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
BookNet Canada
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Jeffrey Haguewood
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
DianaGray10
 
GraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge GraphGraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge Graph
Guy Korland
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance
 
Elevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object CalisthenicsElevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object Calisthenics
Dorra BARTAGUIZ
 
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Thierry Lestable
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
ThousandEyes
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
UiPathCommunity
 
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualitySoftware Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Inflectra
 

Recently uploaded (20)

Leading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdfLeading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdf
 
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
 
Generating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using SmithyGenerating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using Smithy
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
 
FIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance Osaka Seminar: Overview.pdfFIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance Osaka Seminar: Overview.pdf
 
Key Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdfKey Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdf
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
 
The Future of Platform Engineering
The Future of Platform EngineeringThe Future of Platform Engineering
The Future of Platform Engineering
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
 
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
 
GraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge GraphGraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge Graph
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
 
Elevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object CalisthenicsElevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object Calisthenics
 
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
 
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualitySoftware Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality
 

Lecture 6: SVDD

• 1. Lecture 6: Minimum encoding ball and Support vector data description (SVDD). Stéphane Canu, stephane.canu@litislab.eu. Sao Paulo, May 12, 2014.
• 2. Plan. 1 Support Vector Data Description (SVDD): SVDD, the smallest enclosing ball problem; the minimum enclosing ball problem with errors; the minimum enclosing ball problem in a RKHS; the two-class support vector data description (SVDD).
• 3–4. The minimum enclosing ball problem [Tax and Duin, 2004]: two figure slides illustrating the center and the radius of the ball. (Stéphane Canu, INSA Rouen, LITIS, May 12, 2014)
• 5. The minimum enclosing ball problem [Tax and Duin, 2004]: the radius. Given n points {x_i, i = 1, …, n}:
min_{R ∈ ℝ, c ∈ ℝᵈ} R²   subject to   ‖x_i − c‖² ≤ R², i = 1, …, n.
Where does this sit in the convex programming hierarchy: LP, QP, QCQP, SOCP or SDP?
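As a quick sanity check on this primal problem, the MEB can be approximated with the classic Badoiu–Clarkson (Frank–Wolfe) iteration. This is a minimal NumPy sketch of that standard algorithm, not code from the lecture:

```python
import numpy as np

def minimum_enclosing_ball(X, n_iter=2000):
    """Approximate the minimum enclosing ball of the rows of X with
    Badoiu-Clarkson iterations: repeatedly pull the center toward the
    current farthest point with a decaying step 1/(k+1)."""
    c = X[0].astype(float)
    for k in range(1, n_iter + 1):
        j = np.argmax(np.linalg.norm(X - c, axis=1))  # farthest point
        c += (X[j] - c) / (k + 1)
    R = np.linalg.norm(X - c, axis=1).max()
    return c, R

# four corners of a square: the ball has center (0, 0) and radius sqrt(2)
X = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
c, R = minimum_enclosing_ball(X)
```

After a couple of thousand iterations the center and radius are accurate to a few parts per thousand, which is enough to check the QP and dual formulations that follow.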
• 6. The convex programming hierarchy (part of).
LP: min_x fᵀx  s.t. Ax ≤ d and 0 ≤ x.
QP: min_x ½ xᵀGx + fᵀx  s.t. Ax ≤ d.
QCQP: min_x ½ xᵀGx + fᵀx  s.t. xᵀB_i x + a_iᵀx ≤ d_i, i = 1, …, n.
SOCP: min_x fᵀx  s.t. ‖x − a_i‖ ≤ b_iᵀx + d_i, i = 1, …, n.
Model generality: LP < QP < QCQP < SOCP < SDP.
• 7. MEB as a QP in the primal. Theorem (MEB as a QP): the two following problems are equivalent,
min_{R ∈ ℝ, c ∈ ℝᵈ} R²  s.t. ‖x_i − c‖² ≤ R², i = 1, …, n,
and
min_{w,ρ} ½‖w‖² − ρ  s.t. wᵀx_i ≥ ρ + ½‖x_i‖², i = 1, …, n,
with ρ = ½(‖c‖² − R²) and w = c.
Proof: ‖x_i − c‖² ≤ R² ⇔ ‖x_i‖² − 2 x_iᵀc + ‖c‖² ≤ R² ⇔ 2 x_iᵀc ≥ ‖x_i‖² + ‖c‖² − R² ⇔ x_iᵀc ≥ ½(‖c‖² − R²) + ½‖x_i‖², where the first term is ρ.
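The ball/half-space equivalence used in the proof is exact and can be checked numerically point by point. A small sketch with random data (the data and the candidate center are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
c = rng.normal(size=3)                     # a candidate center
R = 2.0                                    # a candidate radius
X = rng.normal(size=(200, 3), scale=2.0)   # test points, some in, some out

w, rho = c, 0.5 * (c @ c - R**2)           # the change of variables of the theorem

in_ball = np.linalg.norm(X - c, axis=1)**2 <= R**2
in_halfspace = X @ w >= rho + 0.5 * (X**2).sum(axis=1)
# in_ball and in_halfspace agree on every point
```

The two boolean vectors coincide because the two inequalities are the same inequality rewritten, term by term, as in the proof above.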
• 8. MEB and the one-class SVM. SVDD: min_{w,ρ} ½‖w‖² − ρ s.t. wᵀx_i ≥ ρ + ½‖x_i‖². SVDD and the linear OC-SVM (supporting hyperplane): if ‖x_i‖² is constant for all i = 1, …, n, this is the linear one-class SVM (OC-SVM) [Schölkopf and Smola, 2002]: min_{w,ρ′} ½‖w‖² − ρ′ s.t. wᵀx_i ≥ ρ′, with ρ′ = ρ + ½‖x_i‖². Hence the OC-SVM is a particular case of the SVDD.
• 9. When ∀i = 1, …, n, ‖x_i‖² = 1: ‖x_i − c‖² ≤ R² ⇔ wᵀx_i ≥ ρ with ρ = ½(‖c‖² − R² + 1). SVDD and OC-SVM: "belonging to the ball" is also "being above" a hyperplane.
• 10. MEB: KKT. The Lagrangian: L(c, R, α) = R² + Σᵢ αᵢ (‖x_i − c‖² − R²). KKT conditions:
stationarity: 2c Σᵢ αᵢ − 2 Σᵢ αᵢ x_i = 0 (the representer theorem), and 1 − Σᵢ αᵢ = 0;
primal admissibility: ‖x_i − c‖² ≤ R²;
dual admissibility: αᵢ ≥ 0, i = 1, …, n;
complementarity: αᵢ (‖x_i − c‖² − R²) = 0, i = 1, …, n.
• 11. MEB: KKT, the radius. Complementarity tells us there are two groups of points: the support vectors, for which ‖x_i − c‖² = R², and the insiders, for which αᵢ = 0.
• 12. MEB: dual. The representer theorem gives c = Σᵢ αᵢ x_i (using Σᵢ αᵢ = 1). Plugging it into the Lagrangian, L(α) = Σᵢ αᵢ ‖x_i − Σⱼ αⱼ xⱼ‖², with Σᵢ Σⱼ αᵢ αⱼ x_iᵀxⱼ = αᵀGα and Σᵢ αᵢ x_iᵀx_i = αᵀdiag(G), where G = XXᵀ is the Gram matrix, G_ij = x_iᵀxⱼ. The dual:
min_α αᵀGα − αᵀdiag(G)  s.t. eᵀα = 1 and 0 ≤ αᵢ, i = 1, …, n.
• 13. SVDD primal vs. dual. Primal: min_{R ∈ ℝ, c ∈ ℝᵈ} R² s.t. ‖x_i − c‖² ≤ R², i = 1, …, n. It has d + 1 unknowns and n constraints, can be recast as a QP, and is perfect when d ≪ n. Dual: min_α αᵀGα − αᵀdiag(G) s.t. eᵀα = 1 and 0 ≤ αᵢ, i = 1, …, n. It has n unknowns, with G the pairwise-influence Gram matrix, and n box constraints; it is easy to solve and is to be used when d > n.
• 14. SVDD primal vs. dual (same comparison, with the remaining question): but where is R²?
• 15. Looking for R². Dual: min_α αᵀGα − αᵀdiag(G) s.t. eᵀα = 1, 0 ≤ αᵢ, i = 1, …, n. The Lagrangian: L(α, μ, β) = αᵀGα − αᵀdiag(G) + μ(eᵀα − 1) − βᵀα. Stationarity condition: ∇_α L(α, μ, β) = 2Gα − diag(G) + μe − β = 0. The bi-dual: min_α αᵀGα + μ s.t. −2Gα + diag(G) ≤ μe. By identification, R² = μ + αᵀGα = μ + ‖c‖², where μ is the Lagrange multiplier associated with the equality constraint Σᵢ αᵢ = 1. Also, by complementarity, if x_i is a support vector (αᵢ > 0), then βᵢ = 0 and R² = ‖x_i − c‖².
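To see the dual and the recovery of R² in action, here is a small NumPy sketch (a plain Frank–Wolfe loop of my own, not code from the slides): it approximately solves the dual over the simplex, reads μ off the stationarity condition at a support vector, and forms R² = μ + αᵀGα.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
G = X @ X.T
d = np.diag(G)

# Frank-Wolfe on the dual: min a'Ga - a'diag(G) over the simplex {e'a = 1, a >= 0}
n = len(X)
a = np.full(n, 1.0 / n)
for k in range(5000):
    grad = 2 * G @ a - d
    j = np.argmin(grad)          # best vertex of the simplex
    gamma = 2.0 / (k + 3)
    a *= 1 - gamma
    a[j] += gamma

c = a @ X                        # representer theorem: c = sum_i a_i x_i
i = np.argmax(a)                 # a support vector (a_i > 0, hence beta_i = 0)
mu = d[i] - 2 * (G @ a)[i]       # stationarity: 2Ga - diag(G) + mu*e - beta = 0
R2 = mu + a @ G @ a              # R^2 = mu + ||c||^2
```

For this support vector, μ + αᵀGα collapses algebraically to ‖x_i − c‖², so R² should match the squared distance of the farthest point, as complementarity predicts.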
• 16. Plan. 1 Support Vector Data Description (SVDD): SVDD, the smallest enclosing ball problem; the minimum enclosing ball problem with errors; the minimum enclosing ball problem in a RKHS; the two-class support vector data description (SVDD).
• 17. The minimum enclosing ball problem with errors: the slack. The same road map: initial formulation, reformulation (as a QP), Lagrangian and KKT, dual formulation, bi-dual. Initial formulation, for a given C:
min_{R,c,ξ} R² + C Σᵢ ξᵢ  s.t. ‖x_i − c‖² ≤ R² + ξᵢ and ξᵢ ≥ 0, i = 1, …, n.
• 18. The MEB with slack: QP, KKT, dual and R². SVDD as a QP:
min_{w,ρ,ξ} ½‖w‖² − ρ + (C/2) Σᵢ ξᵢ  s.t. wᵀx_i ≥ ρ + ½‖x_i‖² − ½ξᵢ and ξᵢ ≥ 0, i = 1, …, n,
again with the OC-SVM as a particular case. With G = XXᵀ, the dual SVDD is:
min_α αᵀGα − αᵀdiag(G)  s.t. eᵀα = 1 and 0 ≤ αᵢ ≤ C, i = 1, …, n,
for a given C ≤ 1. If C is larger than one, the box constraint is useless (it is the no-slack case). R² = μ + cᵀc, with μ denoting the Lagrange multiplier associated with the equality constraint Σᵢ αᵢ = 1.
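The only change in the soft dual is the box 0 ≤ αᵢ ≤ C, so the same Frank–Wolfe idea still applies once the linear minimization step is carried out over the capped simplex. A hedged sketch (the toy data and the greedy capped-simplex oracle are my own choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(size=(30, 2)), [[6.0, 6.0]]])  # a cluster plus one far outlier
G = X @ X.T
d = np.diag(G)
n, C = len(X), 0.1               # C < 1: some slack is allowed

def lmo(grad):
    # linear minimization over {a : e'a = 1, 0 <= a_i <= C}:
    # greedily put mass C on the coordinates with the smallest gradient
    s, left = np.zeros(n), 1.0
    for j in np.argsort(grad):
        s[j] = min(C, left)
        left -= s[j]
        if left <= 1e-12:
            break
    return s

a = np.full(n, 1.0 / n)
for k in range(4000):
    gamma = 2.0 / (k + 3)
    a = (1 - gamma) * a + gamma * lmo(2 * G @ a - d)

c = a @ X                        # the center is pulled far less by the outlier
```

At the optimum the outlier sits outside the ball with its multiplier at the bound, αᵢ = C, which is exactly what the complementarity conditions of the slack formulation predict.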
• 19. Variations on SVDD. Adaptive SVDD (the weighted-error case), for given weights wᵢ, i = 1, …, n:
min_{c ∈ ℝᵖ, R ∈ ℝ, ξ ∈ ℝⁿ} R + C Σᵢ wᵢ ξᵢ  s.t. ‖x_i − c‖² ≤ R + ξᵢ, ξᵢ ≥ 0, i = 1, …, n.
The dual of this problem is a QP [see for instance Liu et al., 2013]:
min_{α ∈ ℝⁿ} αᵀXXᵀα − αᵀdiag(XXᵀ)  s.t. Σᵢ αᵢ = 1, 0 ≤ αᵢ ≤ Cwᵢ, i = 1, …, n.
Density-induced SVDD (D-SVDD):
min_{c ∈ ℝᵖ, R ∈ ℝ, ξ ∈ ℝⁿ} R + C Σᵢ ξᵢ  s.t. wᵢ‖x_i − c‖² ≤ R + ξᵢ, ξᵢ ≥ 0, i = 1, …, n.
• 20. Plan. 1 Support Vector Data Description (SVDD): SVDD, the smallest enclosing ball problem; the minimum enclosing ball problem with errors; the minimum enclosing ball problem in a RKHS; the two-class support vector data description (SVDD).
• 21. SVDD in a RKHS. The feature map: ℝᵖ → H, with c ↦ f(•) and x_i ↦ k(x_i, •); the constraint ‖x_i − c‖²_{ℝᵖ} ≤ R² becomes ‖k(x_i, •) − f(•)‖²_H ≤ R². Kernelized SVDD (in a RKHS) is also a QP:
min_{f ∈ H, R ∈ ℝ, ξ ∈ ℝⁿ} R² + C Σᵢ ξᵢ  s.t. ‖k(x_i, •) − f(•)‖²_H ≤ R² + ξᵢ and ξᵢ ≥ 0, i = 1, …, n.
• 22. SVDD in a RKHS: KKT, dual and R². The Lagrangian:
L = R² + C Σᵢ ξᵢ + Σᵢ αᵢ (‖k(x_i, ·) − f(·)‖²_H − R² − ξᵢ) − Σᵢ βᵢ ξᵢ = R² + C Σᵢ ξᵢ + Σᵢ αᵢ (k(x_i, x_i) − 2f(x_i) + ‖f‖²_H − R² − ξᵢ) − Σᵢ βᵢ ξᵢ.
KKT conditions:
stationarity: 2f(·) Σᵢ αᵢ − 2 Σᵢ αᵢ k(·, x_i) = 0 (the representer theorem), 1 − Σᵢ αᵢ = 0, and C − αᵢ − βᵢ = 0;
primal admissibility: ‖k(x_i, ·) − f(·)‖² ≤ R² + ξᵢ, ξᵢ ≥ 0;
dual admissibility: αᵢ ≥ 0, βᵢ ≥ 0;
complementarity: αᵢ (‖k(x_i, ·) − f(·)‖² − R² − ξᵢ) = 0 and βᵢ ξᵢ = 0.
• 23. SVDD in a RKHS: dual and R². With f(·) = Σⱼ αⱼ k(·, xⱼ):
L(α) = Σᵢ αᵢ k(x_i, x_i) − 2 Σᵢ αᵢ f(x_i) + ‖f‖²_H = Σᵢ αᵢ k(x_i, x_i) − Σᵢ Σⱼ αᵢ αⱼ k(x_i, xⱼ), with G_ij = k(x_i, xⱼ).
The dual:
min_α αᵀGα − αᵀdiag(G)  s.t. eᵀα = 1 and 0 ≤ αᵢ ≤ C, i = 1, …, n.
As in the linear case, R² = μ + ‖f‖²_H, with μ denoting the Lagrange multiplier associated with the equality constraint Σᵢ αᵢ = 1.
• 24. SVDD train and val in a RKHS. Train using the dual form (in: G, C; out: α, μ):
min_α αᵀGα − αᵀdiag(G)  s.t. eᵀα = 1 and 0 ≤ αᵢ ≤ C, i = 1, …, n.
Val, with the center in the RKHS f(·) = Σᵢ αᵢ k(·, x_i):
φ(x) = ‖k(x, ·) − f(·)‖²_H − R² = ‖k(x, ·)‖²_H − 2⟨k(x, ·), f(·)⟩_H + ‖f(·)‖²_H − R² = k(x, x) − 2f(x) + R² − μ − R² = k(x, x) − 2f(x) − μ = k(x, x) − 2 Σᵢ αᵢ k(x, x_i) − μ.
φ(x) = 0 is the decision border.
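Putting the train and val steps together for a Gaussian kernel (so diag(G) = 1 and the linear term of the dual is constant). This is a self-contained sketch with my own toy data and a plain Frank–Wolfe solver, not code from the lecture:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 2))     # training sample
sigma = 2.0                      # kernel bandwidth (an assumed choice)

def kernel(A, B):
    # Gaussian kernel matrix: k(a, b) = exp(-||a - b||^2 / (2 sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# train: min a'Ga - a'diag(G) with e'a = 1, a >= 0 (diag(G) = 1 here)
G = kernel(X, X)
n = len(X)
a = np.full(n, 1.0 / n)
for k in range(3000):
    j = np.argmin(G @ a)         # Frank-Wolfe vertex: smallest gradient coordinate
    gamma = 2.0 / (k + 3)
    a *= 1 - gamma
    a[j] += gamma

# val: mu from stationarity at a support vector, then the decision function
i = np.argmax(a)
mu = 1.0 - 2 * (G @ a)[i]        # 2(Ga)_i - diag(G)_i + mu = 0 at a support vector
def phi(x):
    # phi(x) = k(x, x) - 2 f(x) - mu; phi <= 0 inside the ball, phi = 0 on the border
    return 1.0 - 2 * kernel(np.atleast_2d(x), X) @ a - mu
```

A point in the middle of the data gets φ < 0, and a point far from the sample gets φ > 0, so the zero level set of φ is indeed the decision border.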
• 25. An important theoretical result. For a well-calibrated bandwidth, the SVDD estimates the underlying distribution's level set [Vert and Vert, 2006]. The level sets of a probability density IP(x) are the sets C_p = {x ∈ ℝᵈ | IP(x) ≥ p}. They are well estimated by the empirical minimum-volume set V_p = {x ∈ ℝᵈ | ‖k(x, ·) − f(·)‖²_H − R² ≤ 0}, and the frontiers coincide.
• 26. SVDD: the generalization error. For a well-calibrated bandwidth, with (x₁, …, x_n) i.i.d. from some fixed but unknown IP(x), then [Shawe-Taylor and Cristianini, 2004], with probability at least 1 − δ (∀δ ∈ ]0, 1[), for any margin m > 0:
IP(‖k(x, ·) − f(·)‖²_H ≥ R² + m) ≤ (1/(mn)) Σᵢ ξᵢ + 6R²/(m√n) + 3√(ln(2/δ)/(2n)).
• 27. Equivalence between SVDD and OC-SVM for translation-invariant kernels (diagonal-constant kernels). Theorem: let H be a RKHS on some domain X endowed with kernel k. If there exists some constant c such that ∀x ∈ X, k(x, x) = c, then the two following problems are equivalent,
min_{f,R,ξ} R + C Σᵢ ξᵢ  s.t. ‖k(x_i, ·) − f(·)‖²_H ≤ R + ξᵢ, ξᵢ ≥ 0, i = 1, …, n,
and
min_{f,ρ,ε} ½‖f‖²_H − ρ + C Σᵢ εᵢ  s.t. f(x_i) ≥ ρ − εᵢ, εᵢ ≥ 0, i = 1, …, n,
with ρ = ½(c + ‖f‖²_H − R) and εᵢ = ½ξᵢ.
• 28. Proof of the equivalence between SVDD and OC-SVM. Start from
min_{f ∈ H, R ∈ ℝ, ξ ∈ ℝⁿ} R + C Σᵢ ξᵢ  s.t. ‖k(x_i, ·) − f(·)‖²_H ≤ R + ξᵢ, ξᵢ ≥ 0, i = 1, …, n.
Since ‖k(x_i, ·) − f(·)‖²_H = k(x_i, x_i) + ‖f‖²_H − 2f(x_i), this is
min_{f ∈ H, R ∈ ℝ, ξ ∈ ℝⁿ} R + C Σᵢ ξᵢ  s.t. 2f(x_i) ≥ k(x_i, x_i) + ‖f‖²_H − R − ξᵢ, ξᵢ ≥ 0, i = 1, …, n.
Introducing ρ = ½(c + ‖f‖²_H − R), that is R = c + ‖f‖²_H − 2ρ, and since k(x_i, x_i) is constant and equal to c, the SVDD problem becomes
min_{f ∈ H, ρ ∈ ℝ, ξ ∈ ℝⁿ} ½‖f‖²_H − ρ + (C/2) Σᵢ ξᵢ  s.t. f(x_i) ≥ ρ − ½ξᵢ, ξᵢ ≥ 0, i = 1, …, n.
• 29. This leads to the classical one-class SVM formulation (OC-SVM):
min_{f ∈ H, ρ ∈ ℝ, ε ∈ ℝⁿ} ½‖f‖²_H − ρ + C Σᵢ εᵢ  s.t. f(x_i) ≥ ρ − εᵢ, εᵢ ≥ 0, i = 1, …, n,
with εᵢ = ½ξᵢ. Note that by putting ν = 1/(nC) we can get the so-called ν-formulation of the OC-SVM:
min_{f′ ∈ H, ρ′ ∈ ℝ, ξ′ ∈ ℝⁿ} ½‖f′‖²_H − nνρ′ + Σᵢ ξ′ᵢ  s.t. f′(x_i) ≥ ρ′ − ξ′ᵢ, ξ′ᵢ ≥ 0, i = 1, …, n,
with f′ = Cf, ρ′ = Cρ, and ξ′ = Cξ.
• 30. Duality. The dual of the SVDD is
min_{α ∈ ℝⁿ} αᵀGα − αᵀg  s.t. Σᵢ αᵢ = 1, 0 ≤ αᵢ ≤ C, i = 1, …, n,
where G is the kernel matrix of general term G_ij = k(x_i, xⱼ) and g the diagonal vector such that gᵢ = k(x_i, x_i) = c. The dual of the OC-SVM is the following equivalent QP:
min_{α ∈ ℝⁿ} ½αᵀGα  s.t. Σᵢ αᵢ = 1, 0 ≤ αᵢ ≤ C, i = 1, …, n.
Both dual forms provide the same solution α, but not the same Lagrange multipliers. ρ is the Lagrange multiplier of the equality constraint of the dual of the OC-SVM, and R = c + αᵀGα − 2ρ. Using the SVDD dual, it turns out that R = λ_eq + αᵀGα, where λ_eq is the Lagrange multiplier of the equality constraint of the SVDD dual form.
• 31. Plan. 1 Support Vector Data Description (SVDD): SVDD, the smallest enclosing ball problem; the minimum enclosing ball problem with errors; the minimum enclosing ball problem in a RKHS; the two-class support vector data description (SVDD).
• 32. The two-class support vector data description (SVDD) (figure: two interleaved classes).
min_{c,R,ξ⁺,ξ⁻} R² + C (Σ_{yᵢ=1} ξᵢ⁺ + Σ_{yᵢ=−1} ξᵢ⁻)
s.t. ‖x_i − c‖² ≤ R² + ξᵢ⁺, ξᵢ⁺ ≥ 0 for i such that yᵢ = 1,
and ‖x_i − c‖² ≥ R² − ξᵢ⁻, ξᵢ⁻ ≥ 0 for i such that yᵢ = −1.
• 33. The two-class SVDD as a QP. Expanding the constraints of the problem above:
‖x_i‖² − 2x_iᵀc + ‖c‖² ≤ R² + ξᵢ⁺, ξᵢ⁺ ≥ 0 for i such that yᵢ = 1,
‖x_i‖² − 2x_iᵀc + ‖c‖² ≥ R² − ξᵢ⁻, ξᵢ⁻ ≥ 0 for i such that yᵢ = −1,
that is
2x_iᵀc ≥ ‖c‖² − R² + ‖x_i‖² − ξᵢ⁺ for yᵢ = 1, and −2x_iᵀc ≥ −‖c‖² + R² − ‖x_i‖² − ξᵢ⁻ for yᵢ = −1,
which merge into 2yᵢ x_iᵀc ≥ yᵢ(‖c‖² − R² + ‖x_i‖²) − ξᵢ, ξᵢ ≥ 0, i = 1, …, n. With the change of variable ρ = ‖c‖² − R²:
min_{c,ρ,ξ} ‖c‖² − ρ + C Σᵢ ξᵢ  s.t. 2yᵢ x_iᵀc ≥ yᵢ(ρ + ‖x_i‖²) − ξᵢ and ξᵢ ≥ 0, i = 1, …, n.
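The sign-merging step can be verified numerically: for any center, radius and labels, the per-class ball constraints coincide with the single linear-in-c inequality (taken here at ξ = 0). A quick sketch with random data of my own:

```python
import numpy as np

rng = np.random.default_rng(4)
c = rng.normal(size=2)                    # a candidate center
R = 1.5                                   # a candidate radius
X = rng.normal(size=(200, 2), scale=2.0)
y = rng.choice([-1, 1], size=200)
rho = c @ c - R**2                        # the change of variable of the slide

d2 = ((X - c) ** 2).sum(axis=1)
ball_ok = np.where(y == 1, d2 <= R**2, d2 >= R**2)   # per-class ball constraints
linear_ok = 2 * y * (X @ c) >= y * (rho + (X**2).sum(axis=1))
```

Multiplying the yᵢ = −1 constraint by −1 flips its direction, which is exactly what the factor yᵢ encodes, so the two boolean vectors agree on every point.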
• 34. The dual of the two-class SVDD. With G_ij = yᵢ yⱼ x_iᵀxⱼ, the dual formulation is:
min_{α ∈ ℝⁿ} αᵀGα − Σᵢ αᵢ yᵢ ‖x_i‖²  s.t. Σᵢ yᵢ αᵢ = 1 and 0 ≤ αᵢ ≤ C, i = 1, …, n.
• 35. The two-class SVDD vs. the one-class SVDD (figure: the two-class SVDD on the left, the one-class SVDD on the right).
• 36. Small sphere and large margin (SSLM) approach: support vector data description with margin [Wu and Ye, 2009].
min_{c,R,ξ} R² + C (Σ_{yᵢ=1} ξᵢ⁺ + Σ_{yᵢ=−1} ξᵢ⁻)
s.t. ‖x_i − c‖² ≤ R² − 1 + ξᵢ⁺, ξᵢ⁺ ≥ 0 for i such that yᵢ = 1,
and ‖x_i − c‖² ≥ R² + 1 − ξᵢ⁻, ξᵢ⁻ ≥ 0 for i such that yᵢ = −1.
Since ‖x_i − c‖² ≥ R² + 1 − ξᵢ⁻ with yᵢ = −1 ⇔ yᵢ‖x_i − c‖² ≤ yᵢR² − 1 + ξᵢ⁻, both constraints read yᵢ‖x_i − c‖² ≤ yᵢR² − 1 + ξᵢ, giving the Lagrangian
L(c, R, ξ, α, β) = R² + C Σᵢ ξᵢ + Σᵢ αᵢ (yᵢ‖x_i − c‖² − yᵢR² + 1 − ξᵢ) − Σᵢ βᵢ ξᵢ.
• 37. SVDD with margin: dual formulation. From the Lagrangian
L(c, R, ξ, α, β) = R² + C Σᵢ ξᵢ + Σᵢ αᵢ (yᵢ‖x_i − c‖² − yᵢR² + 1 − ξᵢ) − Σᵢ βᵢ ξᵢ,
optimality gives c = Σᵢ αᵢ yᵢ x_i, Σᵢ αᵢ yᵢ = 1, and 0 ≤ αᵢ ≤ C. Then
L(α) = Σᵢ αᵢ yᵢ ‖x_i − Σⱼ αⱼ yⱼ xⱼ‖² + Σᵢ αᵢ = −Σᵢ Σⱼ αᵢ αⱼ yᵢ yⱼ x_iᵀxⱼ + Σᵢ αᵢ yᵢ ‖x_i‖² + Σᵢ αᵢ.
The dual SVDD is also a quadratic programming problem:
min_{α ∈ ℝⁿ} αᵀGα − eᵀα − fᵀα  s.t. yᵀα = 1 and 0 ≤ αᵢ ≤ C, i = 1, …, n,
with G the symmetric n × n matrix such that G_ij = yᵢ yⱼ x_iᵀxⱼ, and fᵢ = yᵢ ‖x_i‖².
• 38. Conclusion. Applications: outlier detection, change detection, clustering, large numbers of classes, variable selection, … A clear path: reformulation (to a standard problem), KKT, dual, bi-dual. Many variations: L2-SVDD, two non-symmetric classes, two symmetric classes (SVM), the multi-class issue, practical problems with translation-invariant kernels.
• 39. Bibliography.
Bo Liu, Yanshan Xiao, Longbing Cao, Zhifeng Hao, and Feiqi Deng. SVDD-based outlier detection on uncertain data. Knowledge and Information Systems, 34(3):597–618, 2013.
B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.
John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
David M. J. Tax and Robert P. W. Duin. Support vector data description. Machine Learning, 54(1):45–66, 2004.
Régis Vert and Jean-Philippe Vert. Consistency and convergence rates of one-class SVMs and related algorithms. Journal of Machine Learning Research, 7:817–854, 2006.
Mingrui Wu and Jieping Ye. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(11):2088–2092, 2009.