Lecture 6: Minimum encoding ball and Support vector
data description (SVDD)
Stéphane Canu
stephane.canu@litislab.eu
Sao Paulo 2014
May 12, 2014
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
The minimum enclosing ball problem [Tax and Duin, 2004]

Given n points {x_i, i = 1, …, n}, find the smallest ball (center c, radius R) containing them all:

    min_{R ∈ ℝ, c ∈ ℝ^d}  R²
    with  ‖x_i − c‖² ≤ R²,  i = 1, …, n

Where does this sit in the convex programming hierarchy
LP, QP, QCQP, SOCP and SDP?

Stéphane Canu (INSA Rouen - LITIS) May 12, 2014
The convex programming hierarchy (part of)

    LP:    min_x  f⊤x             with  Ax ≤ d  and  0 ≤ x

    QP:    min_x  ½ x⊤Gx + f⊤x    with  Ax ≤ d

    QCQP:  min_x  ½ x⊤Gx + f⊤x    with  x⊤B_i x + a_i⊤x ≤ d_i,  i = 1, …, n

    SOCP:  min_x  f⊤x             with  ‖x − a_i‖ ≤ b_i⊤x + d_i,  i = 1, …, n

Model generality: LP < QP < QCQP < SOCP < SDP
MEB as a QP in the primal

Theorem (MEB as a QP)
The two following problems are equivalent:

    min_{R ∈ ℝ, c ∈ ℝ^d}  R²
    with  ‖x_i − c‖² ≤ R²,  i = 1, …, n

    min_{w, ρ}  ½‖w‖² − ρ
    with  w⊤x_i ≥ ρ + ½‖x_i‖²,  i = 1, …, n

with ρ = ½(‖c‖² − R²) and w = c.

Proof:
    ‖x_i − c‖² ≤ R²
⇔  ‖x_i‖² − 2 x_i⊤c + ‖c‖² ≤ R²
⇔  −2 x_i⊤c ≤ R² − ‖x_i‖² − ‖c‖²
⇔  2 x_i⊤c ≥ −R² + ‖x_i‖² + ‖c‖²
⇔  x_i⊤c ≥ ½(‖c‖² − R²) + ½‖x_i‖² = ρ + ½‖x_i‖²
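This change of variables is easy to check numerically. A minimal NumPy sketch (the data, center and radius are arbitrary illustrative choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # arbitrary sample points
c = rng.normal(size=3)          # a candidate center
R = 1.5                         # a candidate radius

# Change of variables from the theorem: w = c, rho = (||c||^2 - R^2) / 2
w = c
rho = 0.5 * (c @ c - R ** 2)

inside_ball = ((X - c) ** 2).sum(axis=1) <= R ** 2
above_plane = X @ w >= rho + 0.5 * (X ** 2).sum(axis=1)

# Both constraints select exactly the same points
print("both characterizations agree:", np.array_equal(inside_ball, above_plane))
```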
MEB and the one class SVM

SVDD:
    min_{w, ρ}  ½‖w‖² − ρ
    with  w⊤x_i ≥ ρ + ½‖x_i‖²,  i = 1, …, n

SVDD and linear OCSVM (supporting hyperplane): if ‖x_i‖² is constant for all i = 1, …, n, this is the linear one class SVM (OCSVM).

The linear one class SVM [Schölkopf and Smola, 2002]:
    min_{w, ρ′}  ½‖w‖² − ρ′
    with  w⊤x_i ≥ ρ′,  i = 1, …, n

with ρ′ = ρ + ½‖x_i‖²  ⇒  OCSVM is a particular case of SVDD.
When ‖x_i‖² = 1 for all i = 1, …, n:

    ‖x_i − c‖² ≤ R²  ⇔  w⊤x_i ≥ ρ′   with  ρ′ = ½(‖c‖² − R² + 1)

SVDD and OCSVM: "belonging to the ball" is also "being above" a hyperplane.
MEB: KKT

    L(c, R, α) = R² + Σ_{i=1}^n α_i (‖x_i − c‖² − R²)

KKT conditions:
    stationarity:     2c Σ_{i=1}^n α_i − 2 Σ_{i=1}^n α_i x_i = 0  ← the representer theorem
                      1 − Σ_{i=1}^n α_i = 0
    primal admiss.:   ‖x_i − c‖² ≤ R²,  i = 1, …, n
    dual admiss.:     α_i ≥ 0,  i = 1, …, n
    complementarity:  α_i (‖x_i − c‖² − R²) = 0,  i = 1, …, n
Complementarity tells us there are two groups of points: the support vectors, for which ‖x_i − c‖² = R², and the insiders, for which α_i = 0.
MEB: dual

The representer theorem:

    c = (Σ_{i=1}^n α_i x_i) / (Σ_{i=1}^n α_i) = Σ_{i=1}^n α_i x_i

    L(α) = Σ_{i=1}^n α_i ‖x_i − Σ_{j=1}^n α_j x_j‖²

With Σ_i Σ_j α_i α_j x_i⊤x_j = α⊤Gα and Σ_i α_i x_i⊤x_i = α⊤diag(G), where G = XX⊤ is the Gram matrix, G_ij = x_i⊤x_j, the dual is

    min_{α ∈ ℝⁿ}  α⊤Gα − α⊤diag(G)
    with  e⊤α = 1
    and   0 ≤ α_i,  i = 1, …, n
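The dual above is simple enough to solve with a few lines of Frank-Wolfe iteration, since the feasible set is the unit simplex. A minimal NumPy sketch (the data, step rule and iteration count are illustrative choices, not part of the lecture):

```python
import numpy as np

def meb_dual_fw(X, iters=20000):
    """Frank-Wolfe on  min_a  a'Ga - a'diag(G)  over the simplex {a >= 0, e'a = 1}."""
    G = X @ X.T
    g0 = np.diag(G)
    a = np.full(len(X), 1.0 / len(X))    # feasible starting point
    for t in range(iters):
        grad = 2.0 * G @ a - g0          # gradient of the dual objective
        s = np.argmin(grad)              # best simplex vertex e_s
        step = 2.0 / (t + 2.0)           # standard Frank-Wolfe step size
        a *= 1.0 - step
        a[s] += step
    c = a @ X                            # representer theorem: c = sum_i a_i x_i
    R = np.sqrt(((X - c) ** 2).sum(axis=1).max())
    return a, c, R

# Four extreme points on the unit circle plus points strictly inside:
rng = np.random.default_rng(1)
X = np.vstack([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0],
               rng.uniform(-0.5, 0.5, size=(20, 2))])
a, c, R = meb_dual_fw(X)
# The true minimum enclosing ball is centered at the origin with radius 1
```

The recovered center and radius approach (0, 0) and 1 at the usual O(1/t) Frank-Wolfe rate; a generic QP solver would of course do as well.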
SVDD primal vs. dual

Primal:
    min_{R ∈ ℝ, c ∈ ℝ^d}  R²
    with  ‖x_i − c‖² ≤ R²,  i = 1, …, n

    d + 1 unknowns
    n constraints
    can be recast as a QP
    perfect when d ≪ n

Dual:
    min_α  α⊤Gα − α⊤diag(G)
    with  e⊤α = 1
    and   0 ≤ α_i,  i = 1, …, n

    n unknowns, with G the pairwise influence Gram matrix
    n box constraints
    easy to solve
    to be used when d > n

But where is R²?
Looking for R²

    min_α  α⊤Gα − α⊤diag(G)
    with  e⊤α = 1,  0 ≤ α_i,  i = 1, …, n

The Lagrangian:  L(α, µ, β) = α⊤Gα − α⊤diag(G) + µ(e⊤α − 1) − β⊤α
Stationarity condition:  ∇_α L(α, µ, β) = 2Gα − diag(G) + µe − β = 0

The bidual:
    min_α  α⊤Gα + µ
    with  −2Gα + diag(G) ≤ µe

By identification,
    R² = µ + α⊤Gα = µ + ‖c‖²
where µ is the Lagrange multiplier associated with the equality constraint Σ_{i=1}^n α_i = 1.

Also, because of the complementarity condition, if x_i is a support vector (α_i > 0, hence β_i = 0), then R² = ‖x_i − c‖².
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
The minimum enclosing ball problem with errors

The same road map:
    initial formulation
    reformulation (as a QP)
    Lagrangian, KKT
    dual formulation
    bidual

Initial formulation, for a given C:

    min_{R, c, ξ}  R² + C Σ_{i=1}^n ξ_i
    with  ‖x_i − c‖² ≤ R² + ξ_i,  i = 1, …, n
    and   ξ_i ≥ 0,  i = 1, …, n
The MEB with slack: QP, KKT, dual and R²

SVDD as a QP:

    min_{w, ρ, ξ}  ½‖w‖² − ρ + C/2 Σ_{i=1}^n ξ_i
    with  w⊤x_i ≥ ρ + ½‖x_i‖² − ½ξ_i
    and   ξ_i ≥ 0,  i = 1, …, n

again with OCSVM as a particular case.

Dual SVDD, with G = XX⊤:

    min_α  α⊤Gα − α⊤diag(G)
    with  e⊤α = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n

for a given C ≤ 1 (if C is larger than one, the box constraint is never active and we are back to the no-slack case).

    R² = µ + c⊤c

with µ denoting the Lagrange multiplier associated with the equality constraint Σ_{i=1}^n α_i = 1.
Variations on SVDD

Adaptive SVDD, the weighted error case, for given weights w_i, i = 1, …, n:

    min_{c ∈ ℝ^p, R ∈ ℝ, ξ ∈ ℝⁿ}  R + C Σ_{i=1}^n w_i ξ_i
    with  ‖x_i − c‖² ≤ R + ξ_i
    and   ξ_i ≥ 0,  i = 1, …, n

The dual of this problem is a QP [see for instance Liu et al., 2013]:

    min_{α ∈ ℝⁿ}  α⊤XX⊤α − α⊤diag(XX⊤)
    with  Σ_{i=1}^n α_i = 1
    and   0 ≤ α_i ≤ C w_i,  i = 1, …, n

Density induced SVDD (D-SVDD):

    min_{c ∈ ℝ^p, R ∈ ℝ, ξ ∈ ℝⁿ}  R + C Σ_{i=1}^n ξ_i
    with  w_i ‖x_i − c‖² ≤ R + ξ_i
    and   ξ_i ≥ 0,  i = 1, …, n
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
SVDD in a RKHS

The feature map:
    ℝ^p → H
    c ↦ f(•)
    x_i ↦ k(x_i, •)
    ‖x_i − c‖²_{ℝ^p} ≤ R²  ↦  ‖k(x_i, •) − f(•)‖²_H ≤ R²

Kernelized SVDD (in a RKHS) is also a QP:

    min_{f ∈ H, R ∈ ℝ, ξ ∈ ℝⁿ}  R² + C Σ_{i=1}^n ξ_i
    with  ‖k(x_i, •) − f(•)‖²_H ≤ R² + ξ_i,  i = 1, …, n
    and   ξ_i ≥ 0,  i = 1, …, n
SVDD in a RKHS: KKT, dual and R²

    L = R² + C Σ_{i=1}^n ξ_i + Σ_{i=1}^n α_i (‖k(x_i, ·) − f(·)‖²_H − R² − ξ_i) − Σ_{i=1}^n β_i ξ_i
      = R² + C Σ_{i=1}^n ξ_i + Σ_{i=1}^n α_i (k(x_i, x_i) − 2f(x_i) + ‖f‖²_H − R² − ξ_i) − Σ_{i=1}^n β_i ξ_i

KKT conditions:
    Stationarity:
        2f(·) Σ_{i=1}^n α_i − 2 Σ_{i=1}^n α_i k(·, x_i) = 0  ← the representer theorem
        1 − Σ_{i=1}^n α_i = 0
        C − α_i − β_i = 0
    Primal admissibility:  ‖k(x_i, ·) − f(·)‖²_H ≤ R² + ξ_i,  ξ_i ≥ 0
    Dual admissibility:  α_i ≥ 0,  β_i ≥ 0
    Complementarity:
        α_i (‖k(x_i, ·) − f(·)‖²_H − R² − ξ_i) = 0
        β_i ξ_i = 0
SVDD in a RKHS: dual and R²

    L(α) = Σ_{i=1}^n α_i k(x_i, x_i) − 2 Σ_{i=1}^n α_i f(x_i) + ‖f‖²_H   with  f(·) = Σ_{j=1}^n α_j k(·, x_j)
         = Σ_{i=1}^n α_i k(x_i, x_i) − Σ_{i=1}^n Σ_{j=1}^n α_i α_j k(x_i, x_j)

With G_ij = k(x_i, x_j):

    min_α  α⊤Gα − α⊤diag(G)
    with  e⊤α = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n

As in the linear case,
    R² = µ + ‖f‖²_H
with µ denoting the Lagrange multiplier associated with the equality constraint Σ_{i=1}^n α_i = 1.
SVDD train and val in a RKHS

Train using the dual form (in: G, C; out: α, µ):

    min_α  α⊤Gα − α⊤diag(G)
    with  e⊤α = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n

Val with the center in the RKHS, f(·) = Σ_{i=1}^n α_i k(·, x_i):

    Φ(x) = ‖k(x, ·) − f(·)‖²_H − R²
         = ‖k(x, ·)‖²_H − 2⟨k(x, ·), f(·)⟩_H + ‖f(·)‖²_H − R²
         = k(x, x) − 2f(x) + R² − µ − R²          (using ‖f‖²_H = R² − µ)
         = −2f(x) + k(x, x) − µ
         = −2 Σ_{i=1}^n α_i k(x, x_i) + k(x, x) − µ

Φ(x) = 0 is the decision border.
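A minimal end-to-end sketch of train and val with a Gaussian kernel. This is the hard-margin case (C ≥ 1), solved with the same Frank-Wolfe idea on the simplex; R² is recovered here from the largest training distance (a consequence of complementarity) rather than from µ. All names, parameter values and the data are illustrative assumptions:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian kernel matrix, k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def svdd_fit(X, gamma=0.5, iters=5000):
    """Hard-margin kernel SVDD: Frank-Wolfe on min_a a'Ga - a'diag(G), e'a = 1, a >= 0."""
    G = rbf(X, X, gamma)
    a = np.full(len(X), 1.0 / len(X))
    for t in range(iters):
        grad = 2.0 * G @ a - np.diag(G)
        s = np.argmin(grad)
        step = 2.0 / (t + 2.0)
        a *= 1.0 - step
        a[s] += step
    f2 = a @ G @ a                       # ||f||^2_H
    # Squared RKHS distance of each training point to the center f:
    d2 = np.diag(G) - 2.0 * (G @ a) + f2
    R2 = d2.max()                        # radius set by the farthest (support) point
    return a, f2, R2

def svdd_score(Xtr, a, f2, R2, x, gamma=0.5):
    """Phi(x) = k(x,x) - 2 f(x) + ||f||^2_H - R^2; positive means outside the ball."""
    kx = rbf(x[None, :], Xtr, gamma)[0]
    return 1.0 - 2.0 * kx @ a + f2 - R2  # k(x, x) = 1 for the Gaussian kernel

rng = np.random.default_rng(2)
Xtr = rng.uniform(-1.0, 1.0, size=(30, 2))
a, f2, R2 = svdd_fit(Xtr)
scores_train = np.array([svdd_score(Xtr, a, f2, R2, x) for x in Xtr])
score_far = svdd_score(Xtr, a, f2, R2, np.array([10.0, 10.0]))
# Training points lie inside the ball (scores <= 0), a distant point falls outside
```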
An important theoretical result

For a well-calibrated bandwidth, the SVDD estimates the underlying distribution's level set [Vert and Vert, 2006].

The level sets of a probability density function IP(x) are the sets
    C_p = {x ∈ ℝ^d | IP(x) ≥ p}
Each is well estimated by the empirical minimum volume set
    V_p = {x ∈ ℝ^d | ‖k(x, ·) − f(·)‖²_H − R² ≤ 0}
The frontiers coincide.
SVDD: the generalization error

For a well-calibrated bandwidth, let (x_1, …, x_n) be i.i.d. from some fixed but unknown IP(x). Then [Shawe-Taylor and Cristianini, 2004], with probability at least 1 − δ (∀δ ∈ ]0, 1[), for any margin m > 0:

    IP(‖k(x, ·) − f(·)‖²_H ≥ R² + m)  ≤  1/(mn) Σ_{i=1}^n ξ_i + 6R²/(m√n) + 3 √(ln(2/δ)/(2n))
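The right-hand side of this bound is straightforward to evaluate. A small sketch (the function name and the plugged-in values are illustrative):

```python
import numpy as np

def svdd_risk_bound(xi, R2, m, delta):
    """Right-hand side of the bound:
       (1/(m n)) sum_i xi_i + 6 R^2 / (m sqrt(n)) + 3 sqrt(ln(2/delta) / (2 n))."""
    n = len(xi)
    return (xi.sum() / (m * n)
            + 6.0 * R2 / (m * np.sqrt(n))
            + 3.0 * np.sqrt(np.log(2.0 / delta) / (2.0 * n)))

# No slack, unit squared radius, margin m = 1, n = 100 points, 95% confidence:
b = svdd_risk_bound(np.zeros(100), R2=1.0, m=1.0, delta=0.05)
```

For these values the bound is slightly above 1, hence vacuous; it only becomes informative as n grows (it drops near 0.1 for n = 10000 with the same other values).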
Equivalence between SVDD and OCSVM for translation invariant kernels (diagonal constant kernels)

Theorem
Let H be a RKHS on some domain X endowed with kernel k. If there exists some constant c such that ∀x ∈ X, k(x, x) = c, then the two following problems are equivalent:

    min_{f, R, ξ}  R + C Σ_{i=1}^n ξ_i
    with  ‖k(x_i, ·) − f(·)‖²_H ≤ R + ξ_i,  ξ_i ≥ 0,  i = 1, …, n

    min_{f, ρ, ε}  ½‖f‖²_H − ρ + C Σ_{i=1}^n ε_i
    with  f(x_i) ≥ ρ − ε_i,  ε_i ≥ 0,  i = 1, …, n

with ρ = ½(c + ‖f‖²_H − R) and ε_i = ½ξ_i.
Proof of the equivalence between SVDD and OCSVM

    min_{f ∈ H, R ∈ ℝ, ξ ∈ ℝⁿ}  R + C Σ_{i=1}^n ξ_i
    with  ‖k(x_i, ·) − f(·)‖²_H ≤ R + ξ_i,  ξ_i ≥ 0,  i = 1, …, n

Since ‖k(x_i, ·) − f(·)‖²_H = k(x_i, x_i) + ‖f‖²_H − 2f(x_i), this is

    min_{f ∈ H, R ∈ ℝ, ξ ∈ ℝⁿ}  R + C Σ_{i=1}^n ξ_i
    with  2f(x_i) ≥ k(x_i, x_i) + ‖f‖²_H − R − ξ_i,  ξ_i ≥ 0,  i = 1, …, n.

Introducing ρ = ½(c + ‖f‖²_H − R), that is R = c + ‖f‖²_H − 2ρ, and since k(x_i, x_i) is constant and equal to c, the SVDD problem becomes

    min_{f ∈ H, ρ ∈ ℝ, ξ ∈ ℝⁿ}  ½‖f‖²_H − ρ + C/2 Σ_{i=1}^n ξ_i
    with  f(x_i) ≥ ρ − ½ξ_i,  ξ_i ≥ 0,  i = 1, …, n

leading to the classical one class SVM formulation (OCSVM)

    min_{f ∈ H, ρ ∈ ℝ, ε ∈ ℝⁿ}  ½‖f‖²_H − ρ + C Σ_{i=1}^n ε_i
    with  f(x_i) ≥ ρ − ε_i,  ε_i ≥ 0,  i = 1, …, n

with ε_i = ½ξ_i. Note that by putting ν = 1/(nC) we can get the so-called ν-formulation of the OCSVM

    min_{f′ ∈ H, ρ′ ∈ ℝ, ξ′ ∈ ℝⁿ}  ½‖f′‖²_H − nνρ′ + Σ_{i=1}^n ξ′_i
    with  f′(x_i) ≥ ρ′ − ξ′_i,  ξ′_i ≥ 0,  i = 1, …, n

with f′ = Cf, ρ′ = Cρ, and ξ′ = Cξ.
Duality

Note that the dual of the SVDD is

    min_{α ∈ ℝⁿ}  α⊤Gα − α⊤g
    with  Σ_{i=1}^n α_i = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n

where G is the kernel matrix of general term G_ij = k(x_i, x_j) and g the diagonal vector such that g_i = k(x_i, x_i) = c. The dual of the OCSVM is the following equivalent QP

    min_{α ∈ ℝⁿ}  ½ α⊤Gα
    with  Σ_{i=1}^n α_i = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n

Both dual forms provide the same solution α, but not the same Lagrange multipliers. ρ is the Lagrange multiplier of the equality constraint of the dual of the OCSVM, and R = c + α⊤Gα − 2ρ. Using the SVDD dual, it turns out that R = λ_eq + α⊤Gα, where λ_eq is the Lagrange multiplier of the equality constraint of the SVDD dual form.
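The "same solution, different multipliers" claim can be checked numerically: on the feasible set e⊤α = 1 we have α⊤g = c, so the two dual objectives differ only by an affine transformation and share their minimizers. A quick NumPy check with a Gaussian Gram matrix (the data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(15, 2))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
G = np.exp(-0.5 * d2)                      # Gaussian Gram matrix: G_ii = k(x_i, x_i) = 1 = c
g = np.diag(G)

# Random feasible points: a >= 0, sum(a) = 1 (box constraint inactive for C >= 1)
A = rng.dirichlet(np.ones(15), size=100)

quad = np.einsum('ij,jk,ik->i', A, G, A)   # a'Ga for each sample
svdd_vals = quad - A @ g                   # SVDD dual objective
ocsvm_vals = 0.5 * quad                    # OCSVM dual objective

# Since a'g = c = 1 on the feasible set, svdd = 2 * ocsvm - c: same minimizers
print(np.allclose(svdd_vals, 2.0 * ocsvm_vals - 1.0))
```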
Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
The two class Support vector data description (SVDD)

    min_{c, R, ξ⁺, ξ⁻}  R² + C (Σ_{y_i=1} ξ⁺_i + Σ_{y_i=−1} ξ⁻_i)
    with  ‖x_i − c‖² ≤ R² + ξ⁺_i,  ξ⁺_i ≥ 0,  for i such that y_i = 1
    and   ‖x_i − c‖² ≥ R² − ξ⁻_i,  ξ⁻_i ≥ 0,  for i such that y_i = −1
The two class SVDD as a QP

    min_{c, R, ξ⁺, ξ⁻}  R² + C (Σ_{y_i=1} ξ⁺_i + Σ_{y_i=−1} ξ⁻_i)
    with  ‖x_i − c‖² ≤ R² + ξ⁺_i,  ξ⁺_i ≥ 0,  for i such that y_i = 1
    and   ‖x_i − c‖² ≥ R² − ξ⁻_i,  ξ⁻_i ≥ 0,  for i such that y_i = −1

Expanding the squared norms:

    ‖x_i‖² − 2x_i⊤c + ‖c‖² ≤ R² + ξ⁺_i,  ξ⁺_i ≥ 0,  for y_i = 1
    ‖x_i‖² − 2x_i⊤c + ‖c‖² ≥ R² − ξ⁻_i,  ξ⁻_i ≥ 0,  for y_i = −1

that is

    2x_i⊤c ≥ ‖c‖² − R² + ‖x_i‖² − ξ⁺_i,  ξ⁺_i ≥ 0,  for y_i = 1
    −2x_i⊤c ≥ −‖c‖² + R² − ‖x_i‖² − ξ⁻_i,  ξ⁻_i ≥ 0,  for y_i = −1

and the two cases merge into

    2y_i x_i⊤c ≥ y_i (‖c‖² − R² + ‖x_i‖²) − ξ_i,  ξ_i ≥ 0,  i = 1, …, n

With the change of variable ρ = ‖c‖² − R²:

    min_{c, ρ, ξ}  ‖c‖² − ρ + C Σ_{i=1}^n ξ_i
    with  2y_i x_i⊤c ≥ y_i (ρ + ‖x_i‖²) − ξ_i,  i = 1, …, n
    and   ξ_i ≥ 0,  i = 1, …, n
The dual of the two class SVDD

With G_ij = y_i y_j x_i⊤x_j, the dual formulation is:

    min_{α ∈ ℝⁿ}  α⊤Gα − Σ_{i=1}^n α_i y_i ‖x_i‖²
    with  Σ_{i=1}^n y_i α_i = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n
The two class SVDD vs. one class SVDD
The two class SVDD (left) vs. the one class SVDD (right)
Small Sphere and Large Margin (SSLM) approach

Support vector data description with margin [Wu and Ye, 2009]:

    min_{c, R, ξ}  R² + C (Σ_{y_i=1} ξ⁺_i + Σ_{y_i=−1} ξ⁻_i)
    with  ‖x_i − c‖² ≤ R² − 1 + ξ⁺_i,  ξ⁺_i ≥ 0,  for i such that y_i = 1
    and   ‖x_i − c‖² ≥ R² + 1 − ξ⁻_i,  ξ⁻_i ≥ 0,  for i such that y_i = −1

Since ‖x_i − c‖² ≥ R² + 1 − ξ⁻_i with y_i = −1 is equivalent to y_i ‖x_i − c‖² ≤ y_i R² − 1 + ξ⁻_i, the Lagrangian is

    L(c, R, ξ, α, β) = R² + C Σ_{i=1}^n ξ_i + Σ_{i=1}^n α_i (y_i ‖x_i − c‖² − y_i R² + 1 − ξ_i) − Σ_{i=1}^n β_i ξ_i
SVDD with margin: dual formulation

    L(c, R, ξ, α, β) = R² + C Σ_{i=1}^n ξ_i + Σ_{i=1}^n α_i (y_i ‖x_i − c‖² − y_i R² + 1 − ξ_i) − Σ_{i=1}^n β_i ξ_i

Optimality:  c = Σ_{i=1}^n α_i y_i x_i ;  Σ_{i=1}^n α_i y_i = 1 ;  0 ≤ α_i ≤ C

    L(α) = Σ_{i=1}^n α_i y_i ‖x_i − Σ_{j=1}^n α_j y_j x_j‖² + Σ_{i=1}^n α_i
         = − Σ_{i=1}^n Σ_{j=1}^n α_i α_j y_i y_j x_j⊤x_i + Σ_{i=1}^n α_i y_i ‖x_i‖² + Σ_{i=1}^n α_i

The dual is also a quadratic program:

    min_{α ∈ ℝⁿ}  α⊤Gα − e⊤α − f⊤α
    with  y⊤α = 1
    and   0 ≤ α_i ≤ C,  i = 1, …, n

with G the symmetric n × n matrix such that G_ij = y_i y_j x_j⊤x_i and f_i = y_i ‖x_i‖².
Conclusion

Applications:
    outlier detection
    change detection
    clustering
    large number of classes
    variable selection, …

A clear path:
    reformulation (to a standard problem)
    KKT
    dual
    bidual

A lot of variations:
    L2-SVDD
    two classes, non symmetric
    two classes in the symmetric case (SVM)
    the multi-class issue
    practical problems with translation invariant kernels
Bibliography

Bo Liu, Yanshan Xiao, Longbing Cao, Zhifeng Hao, and Feiqi Deng. SVDD-based outlier detection on uncertain data. Knowledge and Information Systems, 34(3):597–618, 2013.

B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.

John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.

David M. J. Tax and Robert P. W. Duin. Support vector data description. Machine Learning, 54(1):45–66, 2004.

Régis Vert and Jean-Philippe Vert. Consistency and convergence rates of one-class SVMs and related algorithms. Journal of Machine Learning Research, 7:817–854, 2006.

Mingrui Wu and Jieping Ye. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(11):2088–2092, 2009.
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 

Recently uploaded (20)

Vulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptxVulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
 
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter RoadsSnow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentation
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 
Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 

Lecture6 svdd

  • 8. MEB and the one-class SVM
    SVDD:
    $$\min_{w,\rho}\ \tfrac{1}{2}\|w\|^2 - \rho \quad \text{with } w^\top x_i \ge \rho + \tfrac{1}{2}\|x_i\|^2$$
    SVDD and linear OCSVM (supporting hyperplane): if $\forall i = 1, n$, $\|x_i\|^2$ is constant, this is the linear one-class SVM (OCSVM).
    The linear one-class SVM [Schölkopf and Smola, 2002]:
    $$\min_{w,\rho'}\ \tfrac{1}{2}\|w\|^2 - \rho' \quad \text{with } w^\top x_i \ge \rho', \qquad \rho' = \rho + \tfrac{1}{2}\|x_i\|^2$$
    ⇒ OCSVM is a particular case of SVDD.
  • 9. When $\forall i = 1, n$, $\|x_i\|^2 = 1$:
    $$\|x_i - c\|^2 \le R^2 \iff w^\top x_i \ge \rho \quad \text{with } \rho = \tfrac{1}{2}(\|c\|^2 - R^2 + 1)$$
    SVDD and OCSVM: "belonging to the ball" is also "being above" a hyperplane.
  • 10. MEB: KKT
    $$L(c, R, \alpha) = R^2 + \sum_{i=1}^n \alpha_i\left(\|x_i - c\|^2 - R^2\right)$$
    KKT conditions:
    stationarity: $2c\sum_{i=1}^n \alpha_i - 2\sum_{i=1}^n \alpha_i x_i = 0$ (the representer theorem) and $1 - \sum_{i=1}^n \alpha_i = 0$
    primal admissibility: $\|x_i - c\|^2 \le R^2$, $i = 1, n$
    dual admissibility: $\alpha_i \ge 0$, $i = 1, n$
    complementarity: $\alpha_i\left(\|x_i - c\|^2 - R^2\right) = 0$, $i = 1, n$
  • 11. MEB: KKT, the radius
    Complementarity tells us there are two groups of points: the support vectors, with $\|x_i - c\|^2 = R^2$, and the insiders, with $\alpha_i = 0$.
  • 12. MEB: dual
    The representer theorem: $c = \dfrac{\sum_{i=1}^n \alpha_i x_i}{\sum_{i=1}^n \alpha_i} = \sum_{i=1}^n \alpha_i x_i$
    $$L(\alpha) = \sum_{i=1}^n \alpha_i \Big\|x_i - \sum_{j=1}^n \alpha_j x_j\Big\|^2$$
    With $\sum_{i}\sum_{j} \alpha_i\alpha_j\, x_i^\top x_j = \alpha^\top G\alpha$ and $\sum_{i} \alpha_i\, x_i^\top x_i = \alpha^\top \operatorname{diag}(G)$, where $G = XX^\top$ is the Gram matrix, $G_{ij} = x_i^\top x_j$:
    $$\min_{\alpha\in\mathbb{R}^n}\ \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) \quad \text{with } e^\top\alpha = 1 \text{ and } 0 \le \alpha_i,\ i = 1,\dots,n$$
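The dual above is a small QP over the simplex and can be solved with a few lines of Frank-Wolfe iteration; this is a minimal numerical sketch, assuming numpy, and the solver name and the triangle example are mine, not from the slides:

```python
import numpy as np

# Minimal sketch: solve the MEB dual  min_a a'Ga - a'diag(G)
# s.t. e'a = 1, a >= 0, by Frank-Wolfe (linear steps toward simplex vertices).

def solve_meb_dual(X, n_iter=5000):
    G = X @ X.T                      # Gram matrix G_ij = x_i' x_j
    d = np.diag(G).copy()
    n = len(X)
    a = np.full(n, 1.0 / n)          # start at the simplex barycenter
    for k in range(n_iter):
        grad = 2 * G @ a - d         # gradient of a'Ga - a'd
        j = int(np.argmin(grad))     # best simplex vertex e_j
        gamma = 2.0 / (k + 2)        # standard Frank-Wolfe step size
        a = (1 - gamma) * a
        a[j] += gamma
    c = a @ X                        # representer theorem: c = sum_i a_i x_i
    R2 = np.max(np.sum((X - c) ** 2, axis=1))  # radius: farthest point
    return a, c, R2

# Right triangle: the MEB center is the midpoint of the hypotenuse, (1, 1).
X = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
a, c, R2 = solve_meb_dual(X)
```

On this toy triangle the recovered center is close to $(1, 1)$ and $R^2$ close to $2$, matching the geometric smallest enclosing ball.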
  • 13. SVDD primal vs. dual
    Primal:
    $$\min_{R\in\mathbb{R},\, c\in\mathbb{R}^d}\ R^2 \quad \text{with } \|x_i - c\|^2 \le R^2,\ i = 1,\dots,n$$
    d + 1 unknowns, n constraints; can be recast as a QP; perfect when d ≪ n.
    Dual:
    $$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) \quad \text{with } e^\top\alpha = 1 \text{ and } 0 \le \alpha_i,\ i = 1,\dots,n$$
    n unknowns, with G the pairwise-influence Gram matrix; n box constraints; easy to solve; to be used when d > n.
  • 14. But where is $R^2$?
  • 15. Looking for $R^2$
    $$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) \quad \text{with } e^\top\alpha = 1,\ 0 \le \alpha_i,\ i = 1, n$$
    The Lagrangian: $L(\alpha, \mu, \beta) = \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) + \mu(e^\top\alpha - 1) - \beta^\top\alpha$
    Stationarity condition: $\nabla_\alpha L(\alpha, \mu, \beta) = 2G\alpha - \operatorname{diag}(G) + \mu e - \beta = 0$
    The bidual:
    $$\min_\alpha\ \alpha^\top G\alpha + \mu \quad \text{with } -2G\alpha + \operatorname{diag}(G) \le \mu e$$
    By identification, $R^2 = \mu + \alpha^\top G\alpha = \mu + \|c\|^2$, where $\mu$ is the Lagrange multiplier associated with the equality constraint $\sum_{i=1}^n \alpha_i = 1$.
    Also, because of the complementarity condition, if $x_i$ is a support vector, then $\beta_i = 0$ implies $\alpha_i > 0$ and $R^2 = \|x_i - c\|^2$.
  • 16. Plan
    1 Support Vector Data Description (SVDD)
      SVDD, the smallest enclosing ball problem
      The minimum enclosing ball problem with errors
      The minimum enclosing ball problem in a RKHS
      The two class Support vector data description (SVDD)
  • 17. The minimum enclosing ball problem with errors: the slack
    The same road map: initial formulation; reformulation (as a QP); Lagrangian, KKT; dual formulation; bidual.
    Initial formulation, for a given C:
    $$\min_{R, c, \xi}\ R^2 + C\sum_{i=1}^n \xi_i \quad \text{with } \|x_i - c\|^2 \le R^2 + \xi_i \text{ and } \xi_i \ge 0,\ i = 1,\dots,n$$
  • 18. The MEB with slack: QP, KKT, dual and $R^2$
    SVDD as a QP:
    $$\min_{w,\rho,\xi}\ \tfrac{1}{2}\|w\|^2 - \rho + \tfrac{C}{2}\sum_{i=1}^n \xi_i \quad \text{with } w^\top x_i \ge \rho + \tfrac{1}{2}\|x_i\|^2 - \tfrac{1}{2}\xi_i \text{ and } \xi_i \ge 0,\ i = 1, n$$
    again with OCSVM as a particular case. With $G = XX^\top$, the dual SVDD is
    $$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) \quad \text{with } e^\top\alpha = 1 \text{ and } 0 \le \alpha_i \le C,\ i = 1, n$$
    for a given $C \le 1$ (if C is larger than one the box constraint is inactive and we are back to the no-slack case).
    $R^2 = \mu + c^\top c$, with $\mu$ denoting the Lagrange multiplier associated with the equality constraint $\sum_{i=1}^n \alpha_i = 1$.
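The slack dual only adds the box constraint $\alpha_i \le C$; a short projected-gradient sketch, assuming numpy (the projection helper, toy points and C value are illustrative, and feasibility needs $nC \ge 1$):

```python
import numpy as np

# Sketch of the slack dual: min a'Ga - a'diag(G), e'a = 1, 0 <= a_i <= C.
# Projection onto the capped simplex {e'a = 1, 0 <= a <= C} is computed by
# bisection on the shift tau in a = clip(v - tau, 0, C).

def project_capped_simplex(v, C, tol=1e-10):
    lo, hi = v.min() - 1.0, v.max()          # bracket for the shift tau
    for _ in range(100):
        tau = 0.5 * (lo + hi)
        s = np.clip(v - tau, 0.0, C).sum()   # monotone nonincreasing in tau
        if s > 1.0:
            lo = tau
        else:
            hi = tau
        if abs(s - 1.0) < tol:
            break
    return np.clip(v - tau, 0.0, C)

def solve_slack_dual(X, C, n_iter=2000):
    G = X @ X.T
    d = np.diag(G).copy()
    step = 1.0 / (2 * np.linalg.norm(G, 2))  # 1/L for the quadratic objective
    a = project_capped_simplex(np.full(len(X), 1.0 / len(X)), C)
    for _ in range(n_iter):
        a = project_capped_simplex(a - step * (2 * G @ a - d), C)
    return a

# Three inliers and one far outlier; with C < 1 the outlier's alpha is capped.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [8.0, 8.0]])
a = solve_slack_dual(X, C=0.3)
```

On this example the outlier's coefficient sits exactly at the bound $C$, illustrating that points strictly outside the ball satisfy $\alpha_i = C$.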
  • 19. Variations over SVDD
    Adaptive SVDD, the weighted-error case, for given $w_i$, $i = 1, n$:
    $$\min_{c\in\mathbb{R}^p, R\in\mathbb{R}, \xi\in\mathbb{R}^n}\ R + C\sum_{i=1}^n w_i\xi_i \quad \text{with } \|x_i - c\|^2 \le R + \xi_i,\ \xi_i \ge 0,\ i = 1, n$$
    The dual of this problem is a QP [see for instance Liu et al., 2013]:
    $$\min_{\alpha\in\mathbb{R}^n}\ \alpha^\top XX^\top\alpha - \alpha^\top\operatorname{diag}(XX^\top) \quad \text{with } \sum_{i=1}^n \alpha_i = 1,\ 0 \le \alpha_i \le Cw_i,\ i = 1, n$$
    Density-induced SVDD (D-SVDD):
    $$\min_{c\in\mathbb{R}^p, R\in\mathbb{R}, \xi\in\mathbb{R}^n}\ R + C\sum_{i=1}^n \xi_i \quad \text{with } w_i\|x_i - c\|^2 \le R + \xi_i,\ \xi_i \ge 0,\ i = 1, n$$
  • 20. Plan
    1 Support Vector Data Description (SVDD)
      SVDD, the smallest enclosing ball problem
      The minimum enclosing ball problem with errors
      The minimum enclosing ball problem in a RKHS
      The two class Support vector data description (SVDD)
  • 21. SVDD in a RKHS
    The feature map: $\mathbb{R}^p \to \mathcal{H}$, $c \to f(\cdot)$, $x_i \to k(x_i, \cdot)$, so that
    $$\|x_i - c\|^2_{\mathbb{R}^p} \le R^2 \ \longrightarrow\ \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R^2$$
    Kernelized SVDD (in a RKHS) is also a QP:
    $$\min_{f\in\mathcal{H}, R\in\mathbb{R}, \xi\in\mathbb{R}^n}\ R^2 + C\sum_{i=1}^n \xi_i \quad \text{with } \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R^2 + \xi_i,\ \xi_i \ge 0,\ i = 1, n$$
  • 22. SVDD in a RKHS: KKT, dual and $R^2$
    $$L = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(\|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
    $$\ \ = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(k(x_i, x_i) - 2f(x_i) + \|f\|^2_{\mathcal{H}} - R^2 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
    KKT conditions:
    stationarity: $2f(\cdot)\sum_{i=1}^n \alpha_i - 2\sum_{i=1}^n \alpha_i k(\cdot, x_i) = 0$ (the representer theorem), $1 - \sum_{i=1}^n \alpha_i = 0$, and $C - \alpha_i - \beta_i = 0$
    primal admissibility: $\|k(x_i, \cdot) - f(\cdot)\|^2 \le R^2 + \xi_i$, $\xi_i \ge 0$
    dual admissibility: $\alpha_i \ge 0$, $\beta_i \ge 0$
    complementarity: $\alpha_i\left(\|k(x_i, \cdot) - f(\cdot)\|^2 - R^2 - \xi_i\right) = 0$ and $\beta_i\xi_i = 0$
  • 23. SVDD in a RKHS: dual and $R^2$
    $$L(\alpha) = \sum_{i=1}^n \alpha_i k(x_i, x_i) - 2\sum_{i=1}^n \alpha_i f(x_i) + \|f\|^2_{\mathcal{H}}, \qquad f(\cdot) = \sum_{j=1}^n \alpha_j k(\cdot, x_j)$$
    $$\ \ = \sum_{i=1}^n \alpha_i k(x_i, x_i) - \sum_{i=1}^n\sum_{j=1}^n \alpha_i\alpha_j \underbrace{k(x_i, x_j)}_{G_{ij}}$$
    $$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) \quad \text{with } e^\top\alpha = 1 \text{ and } 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
    As in the linear case, $R^2 = \mu + \|f\|^2_{\mathcal{H}}$, with $\mu$ denoting the Lagrange multiplier associated with the equality constraint $\sum_{i=1}^n \alpha_i = 1$.
  • 24. SVDD train and val in a RKHS
    Train using the dual form (in: G, C; out: α, µ):
    $$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\operatorname{diag}(G) \quad \text{with } e^\top\alpha = 1 \text{ and } 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
    Val, with the center in the RKHS $f(\cdot) = \sum_{i=1}^n \alpha_i k(\cdot, x_i)$:
    $$\begin{aligned}
    \phi(x) &= \|k(x, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2\\
    &= \|k(x, \cdot)\|^2_{\mathcal{H}} - 2\langle k(x, \cdot), f(\cdot)\rangle_{\mathcal{H}} + \|f(\cdot)\|^2_{\mathcal{H}} - R^2\\
    &= k(x, x) - 2f(x) + R^2 - \mu - R^2\\
    &= -2f(x) + k(x, x) - \mu = -2\sum_{i=1}^n \alpha_i k(x, x_i) + k(x, x) - \mu
    \end{aligned}$$
    $\phi(x) = 0$ is the decision border.
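The train/val pair above can be sketched end to end with a Gaussian kernel; this is a minimal illustration, assuming numpy, with C = 1 (no slack, so the simplex constraint suffices) and a short Frank-Wolfe loop standing in for a QP solver (the bandwidth, data and names are my assumptions, and the radius is read off the farthest training point rather than from µ):

```python
import numpy as np

# Train a kernel SVDD on one Gaussian blob, then evaluate the decision
# function phi(x) = k(x,x) - 2 f(x) + ||f||^2 - R^2 (the ball is phi <= 0).

def gaussian_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(0.0, 0.5, size=(40, 2))       # one cluster of training points
G = gaussian_kernel(X, X)
d = np.diag(G).copy()                         # here d_i = k(x_i, x_i) = 1

a = np.full(len(X), 1.0 / len(X))
for k in range(3000):                         # Frank-Wolfe on the simplex
    grad = 2 * G @ a - d
    j = int(np.argmin(grad))
    gamma = 2.0 / (k + 2)
    a = (1 - gamma) * a
    a[j] += gamma

f_norm2 = a @ G @ a                           # ||f||^2 with f = sum_i a_i k(., x_i)
f_train = G @ a                               # f(x_i) on the training points
R2 = np.max(d - 2 * f_train + f_norm2)        # radius: farthest training point

def phi(x):
    # phi(x) = k(x,x) - 2 f(x) + ||f||^2 - R^2
    kx = gaussian_kernel(np.atleast_2d(x), X)[0]
    return 1.0 - 2.0 * (kx @ a) + f_norm2 - R2
```

All training points end up inside or on the ball ($\phi \le 0$), while a point far from the blob gets a positive score.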
  • 25. An important theoretical result
    For a well-calibrated bandwidth, the SVDD estimates the underlying distribution level set [Vert and Vert, 2006].
    The level sets of a probability density function $\mathbb{P}(x)$ are the sets $C_p = \{x \in \mathbb{R}^d \mid \mathbb{P}(x) \ge p\}$.
    They are well estimated by the empirical minimum-volume set $V_p = \{x \in \mathbb{R}^d \mid \|k(x, \cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 \le 0\}$.
    The frontiers coincide.
  • 26. SVDD: the generalization error
    For a well-calibrated bandwidth, with $(x_1, \dots, x_n)$ i.i.d. from some fixed but unknown $\mathbb{P}(x)$, then [Shawe-Taylor and Cristianini, 2004] with probability at least $1 - \delta$ ($\forall\delta \in\, ]0, 1[$), for any margin $m > 0$:
    $$\mathbb{P}\left(\|k(x, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \ge R^2 + m\right) \le \frac{1}{mn}\sum_{i=1}^n \xi_i + \frac{6R^2}{m\sqrt{n}} + 3\sqrt{\frac{\ln(2/\delta)}{2n}}$$
  • 27. Equivalence between SVDD and OCSVM for translation-invariant kernels (diagonal-constant kernels)
    Theorem. Let $\mathcal{H}$ be a RKHS on some domain $\mathcal{X}$ endowed with kernel k. If there exists some constant c such that $\forall x \in \mathcal{X},\ k(x, x) = c$, then the two following problems are equivalent:
    $$\min_{f, R, \xi}\ R + C\sum_{i=1}^n \xi_i \quad \text{with } \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R + \xi_i,\ \xi_i \ge 0,\ i = 1, n$$
    $$\min_{f, \rho, \varepsilon}\ \tfrac{1}{2}\|f\|^2_{\mathcal{H}} - \rho + C\sum_{i=1}^n \varepsilon_i \quad \text{with } f(x_i) \ge \rho - \varepsilon_i,\ \varepsilon_i \ge 0,\ i = 1, n$$
    with $\rho = \tfrac{1}{2}(c + \|f\|^2_{\mathcal{H}} - R)$ and $\varepsilon_i = \tfrac{1}{2}\xi_i$.
  • 28. Proof of the equivalence between SVDD and OCSVM
    $$\min_{f\in\mathcal{H}, R\in\mathbb{R}, \xi\in\mathbb{R}^n}\ R + C\sum_{i=1}^n \xi_i \quad \text{with } \|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R + \xi_i,\ \xi_i \ge 0,\ i = 1, n$$
    Since $\|k(x_i, \cdot) - f(\cdot)\|^2_{\mathcal{H}} = k(x_i, x_i) + \|f\|^2_{\mathcal{H}} - 2f(x_i)$:
    $$\min_{f\in\mathcal{H}, R\in\mathbb{R}, \xi\in\mathbb{R}^n}\ R + C\sum_{i=1}^n \xi_i \quad \text{with } 2f(x_i) \ge k(x_i, x_i) + \|f\|^2_{\mathcal{H}} - R - \xi_i,\ \xi_i \ge 0,\ i = 1, n.$$
    Introducing $\rho = \tfrac{1}{2}(c + \|f\|^2_{\mathcal{H}} - R)$, that is $R = c + \|f\|^2_{\mathcal{H}} - 2\rho$, and since $k(x_i, x_i)$ is constant and equal to c, the SVDD problem becomes
    $$\min_{f\in\mathcal{H}, \rho\in\mathbb{R}, \xi\in\mathbb{R}^n}\ \tfrac{1}{2}\|f\|^2_{\mathcal{H}} - \rho + \tfrac{C}{2}\sum_{i=1}^n \xi_i \quad \text{with } f(x_i) \ge \rho - \tfrac{1}{2}\xi_i,\ \xi_i \ge 0,\ i = 1, n$$
  • 29. ...leading to the classical one-class SVM formulation (OCSVM):
    $$\min_{f\in\mathcal{H}, \rho\in\mathbb{R}, \varepsilon\in\mathbb{R}^n}\ \tfrac{1}{2}\|f\|^2_{\mathcal{H}} - \rho + C\sum_{i=1}^n \varepsilon_i \quad \text{with } f(x_i) \ge \rho - \varepsilon_i,\ \varepsilon_i \ge 0,\ i = 1, n$$
    with $\varepsilon_i = \tfrac{1}{2}\xi_i$. Note that by putting $\nu = \tfrac{1}{nC}$ we can get the so-called ν-formulation of the OCSVM:
    $$\min_{f'\in\mathcal{H}, \rho'\in\mathbb{R}, \xi'\in\mathbb{R}^n}\ \tfrac{1}{2}\|f'\|^2_{\mathcal{H}} - n\nu\rho' + \sum_{i=1}^n \xi'_i \quad \text{with } f'(x_i) \ge \rho' - \xi'_i,\ \xi'_i \ge 0,\ i = 1, n$$
    with $f' = Cf$, $\rho' = C\rho$, and $\xi' = C\xi$.
  • 30. Duality
    Note that the dual of the SVDD is
    $$\min_{\alpha\in\mathbb{R}^n}\ \alpha^\top G\alpha - \alpha^\top g \quad \text{with } \sum_{i=1}^n \alpha_i = 1,\ 0 \le \alpha_i \le C,\ i = 1, n$$
    where G is the kernel matrix of general term $G_{i,j} = k(x_i, x_j)$ and g the diagonal vector such that $g_i = k(x_i, x_i) = c$. The dual of the OCSVM is the following equivalent QP:
    $$\min_{\alpha\in\mathbb{R}^n}\ \tfrac{1}{2}\alpha^\top G\alpha \quad \text{with } \sum_{i=1}^n \alpha_i = 1,\ 0 \le \alpha_i \le C,\ i = 1, n$$
    Both dual forms provide the same solution α, but not the same Lagrange multipliers. ρ is the Lagrange multiplier of the equality constraint of the dual of the OCSVM, and $R = c + \alpha^\top G\alpha - 2\rho$. Using the SVDD dual, it turns out that $R = \lambda_{eq} + \alpha^\top G\alpha$, where $\lambda_{eq}$ is the Lagrange multiplier of the equality constraint of the SVDD dual form.
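The claim that both duals share the same solution is easy to check numerically: with $e^\top\alpha = 1$ the SVDD linear term $\alpha^\top g = c$ is constant, so the two objectives differ by a constant. A minimal sketch, assuming numpy and using a small Frank-Wolfe solver of my own on a Gaussian kernel (whose diagonal is constant, $k(x,x) = 1$):

```python
import numpy as np

# Solve both duals with the same routine; for a constant-diagonal kernel the
# SVDD linear term only shifts the objective by a constant, so the
# minimisers coincide.

def fw_min(G, lin, n_iter=2000):
    # min a'Ga - lin'a over the simplex {e'a = 1, a >= 0}, by Frank-Wolfe
    a = np.full(G.shape[0], 1.0 / G.shape[0])
    for k in range(n_iter):
        grad = 2 * G @ a - lin
        j = int(np.argmin(grad))
        gamma = 2.0 / (k + 2)
        a = (1 - gamma) * a
        a[j] += gamma
    return a

rng = np.random.default_rng(1)
X = rng.normal(size=(15, 2))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
G = np.exp(-d2)                          # Gaussian kernel: k(x, x) = 1 for all x

a_svdd = fw_min(G, np.diag(G))           # SVDD dual: min a'Ga - a'g
a_ocsvm = fw_min(G, np.zeros(len(X)))    # OCSVM dual: min a'Ga (same minimiser as (1/2) a'Ga)
```

Shifting the gradient by a constant vector does not change the Frank-Wolfe vertex choice, so the two runs produce identical coefficients.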
  • 31. Plan
    1 Support Vector Data Description (SVDD)
      SVDD, the smallest enclosing ball problem
      The minimum enclosing ball problem with errors
      The minimum enclosing ball problem in a RKHS
      The two class Support vector data description (SVDD)
  • 32. The two-class support vector data description (SVDD)
    [two figures: scatter plots of the two-class data with the enclosing ball]
    $$\min_{c, R, \xi^+, \xi^-}\ R^2 + C\Big(\sum_{y_i=1} \xi_i^+ + \sum_{y_i=-1} \xi_i^-\Big)$$
    $$\text{with } \|x_i - c\|^2 \le R^2 + \xi_i^+,\ \xi_i^+ \ge 0 \text{ for } i \text{ such that } y_i = 1$$
    $$\text{and } \|x_i - c\|^2 \ge R^2 - \xi_i^-,\ \xi_i^- \ge 0 \text{ for } i \text{ such that } y_i = -1$$
  • 33. The two-class SVDD as a QP
    $$\min_{c, R, \xi^+, \xi^-}\ R^2 + C\Big(\sum_{y_i=1} \xi_i^+ + \sum_{y_i=-1} \xi_i^-\Big) \quad \text{with } \|x_i - c\|^2 \le R^2 + \xi_i^+\ (y_i = 1),\ \ \|x_i - c\|^2 \ge R^2 - \xi_i^-\ (y_i = -1)$$
    Expanding the constraints:
    $$\|x_i\|^2 - 2x_i^\top c + \|c\|^2 \le R^2 + \xi_i^+ \quad (y_i = 1), \qquad \|x_i\|^2 - 2x_i^\top c + \|c\|^2 \ge R^2 - \xi_i^- \quad (y_i = -1)$$
    $$2x_i^\top c \ge \|c\|^2 - R^2 + \|x_i\|^2 - \xi_i^+ \quad (y_i = 1), \qquad -2x_i^\top c \ge -\|c\|^2 + R^2 - \|x_i\|^2 - \xi_i^- \quad (y_i = -1)$$
    $$2y_i x_i^\top c \ge y_i\left(\|c\|^2 - R^2 + \|x_i\|^2\right) - \xi_i, \quad \xi_i \ge 0,\ i = 1, n$$
    Change of variable $\rho = \|c\|^2 - R^2$:
    $$\min_{c, \rho, \xi}\ \|c\|^2 - \rho + C\sum_{i=1}^n \xi_i \quad \text{with } 2y_i x_i^\top c \ge y_i\left(\rho + \|x_i\|^2\right) - \xi_i \text{ and } \xi_i \ge 0,\ i = 1, n$$
  • 34. The dual of the two-class SVDD
    With $G_{ij} = y_i y_j\, x_i^\top x_j$, the dual formulation is
    $$\min_{\alpha\in\mathbb{R}^n}\ \alpha^\top G\alpha - \sum_{i=1}^n \alpha_i y_i\|x_i\|^2 \quad \text{with } \sum_{i=1}^n y_i\alpha_i = 1,\ 0 \le \alpha_i \le C,\ i = 1, n$$
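The two-class dual differs from the one-class case only through the signed constraint $y^\top\alpha = 1$; a projected-gradient sketch, assuming numpy (the projection helper, the hexagon/square toy data and the C value are my assumptions; the projection uses the KKT form $\alpha = \operatorname{clip}(v - \tau y, 0, C)$ with $\tau$ found by bisection):

```python
import numpy as np

# Sketch of the two-class SVDD dual: min a'Ga - sum_i a_i y_i ||x_i||^2
# with y'a = 1 and 0 <= a_i <= C, where G_ij = y_i y_j x_i'x_j.

def project(v, y, C):
    lo, hi = -50.0, 50.0                        # wide ad hoc bracket for tau
    for _ in range(200):
        tau = 0.5 * (lo + hi)
        s = y @ np.clip(v - tau * y, 0.0, C)    # nonincreasing in tau
        if s > 1.0:
            lo = tau
        else:
            hi = tau
    return np.clip(v - tau * y, 0.0, C)

# Positives: hexagon of radius 1; negatives: square of radius 6.
t_pos = np.linspace(0.0, 2 * np.pi, 7)[:-1]
t_neg = np.linspace(0.0, 2 * np.pi, 5)[:-1]
X = np.vstack([np.c_[np.cos(t_pos), np.sin(t_pos)],
               6 * np.c_[np.cos(t_neg), np.sin(t_neg)]])
y = np.array([1.0] * 6 + [-1.0] * 4)
C = 0.5                                          # feasibility needs n_pos * C >= 1

G = (y[:, None] * y[None, :]) * (X @ X.T)        # G_ij = y_i y_j x_i' x_j
lin = y * (X ** 2).sum(axis=1)                   # linear term y_i ||x_i||^2
step = 1.0 / (2 * np.linalg.norm(G, 2))          # 1/L for the quadratic

a = project(np.full(len(y), 1.0 / len(y)), y, C)
for _ in range(3000):
    a = project(a - step * (2 * G @ a - lin), y, C)

c = (a * y) @ X                                   # center c = sum_i a_i y_i x_i
```

By the symmetry of this toy configuration the recovered center sits at the origin, with the positives enclosed and the negatives kept outside.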
  • 35. The two-class SVDD vs. one-class SVDD
    [figure: the two-class SVDD (left) vs. the one-class SVDD (right)]
  • 36. Small sphere and large margin (SSLM) approach
    Support vector data description with margin [Wu and Ye, 2009]:
    $$\min_{c, R, \xi}\ R^2 + C\Big(\sum_{y_i=1} \xi_i^+ + \sum_{y_i=-1} \xi_i^-\Big)$$
    $$\text{with } \|x_i - c\|^2 \le R^2 - 1 + \xi_i^+,\ \xi_i^+ \ge 0 \text{ for } i \text{ such that } y_i = 1$$
    $$\text{and } \|x_i - c\|^2 \ge R^2 + 1 - \xi_i^-,\ \xi_i^- \ge 0 \text{ for } i \text{ such that } y_i = -1$$
    $$\|x_i - c\|^2 \ge R^2 + 1 - \xi_i^- \text{ and } y_i = -1 \iff y_i\|x_i - c\|^2 \le y_i R^2 - 1 + \xi_i^-$$
    $$L(c, R, \xi, \alpha, \beta) = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(y_i\|x_i - c\|^2 - y_i R^2 + 1 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
    [figure: two-class data with the small-sphere, large-margin boundary]
  • 37. SVDD with margin: dual formulation
    $$L(c, R, \xi, \alpha, \beta) = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(y_i\|x_i - c\|^2 - y_i R^2 + 1 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
    Optimality: $c = \sum_{i=1}^n \alpha_i y_i x_i$; $\sum_{i=1}^n \alpha_i y_i = 1$; $0 \le \alpha_i \le C$.
    $$L(\alpha) = \sum_{i=1}^n \alpha_i y_i\Big\|x_i - \sum_{j=1}^n \alpha_j y_j x_j\Big\|^2 + \sum_{i=1}^n \alpha_i = -\sum_{i=1}^n\sum_{j=1}^n \alpha_i\alpha_j y_i y_j\, x_j^\top x_i + \sum_{i=1}^n \|x_i\|^2 y_i\alpha_i + \sum_{i=1}^n \alpha_i$$
    The dual SVDD with margin is also a quadratic programming problem (D):
    $$\min_{\alpha\in\mathbb{R}^n}\ \alpha^\top G\alpha - e^\top\alpha - f^\top\alpha \quad \text{with } y^\top\alpha = 1 \text{ and } 0 \le \alpha_i \le C,\ i = 1, n$$
    with G an $n \times n$ symmetric matrix such that $G_{ij} = y_i y_j\, x_j^\top x_i$ and $f_i = \|x_i\|^2 y_i$.
  • 38. Conclusion
    Applications: outlier detection, change detection, clustering, large numbers of classes, variable selection, ...
    A clear path: reformulation (to a standard problem), KKT, dual, bidual.
    A lot of variations: L2-SVDD, two classes non-symmetric, two classes in the symmetric case (SVM), the multi-class issue, practical problems with translation-invariant kernels.
  • 39. Bibliography
    Bo Liu, Yanshan Xiao, Longbing Cao, Zhifeng Hao, and Feiqi Deng. SVDD-based outlier detection on uncertain data. Knowledge and Information Systems, 34(3):597–618, 2013.
    B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.
    John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
    David M. J. Tax and Robert P. W. Duin. Support vector data description. Machine Learning, 54(1):45–66, 2004.
    Régis Vert and Jean-Philippe Vert. Consistency and convergence rates of one-class SVMs and related algorithms. The Journal of Machine Learning Research, 7:817–854, 2006.
    Mingrui Wu and Jieping Ye. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(11):2088–2092, 2009.