This document provides an overview of Support Vector Data Description (SVDD), which finds the minimum enclosing ball that encapsulates a set of data points. It discusses how SVDD can be formulated as a quadratic programming problem and outlines its dual formulation. The document also notes that SVDD generalizes to non-linear settings using kernels, and discusses variations like adaptive SVDD and density-induced SVDD. Key points covered include the representer theorem, KKT conditions, and how the radius of the enclosing ball can be determined from the Lagrangian.
1. Lecture 6: Minimum enclosing ball and Support Vector Data Description (SVDD)
Stéphane Canu
stephane.canu@litislab.eu
Sao Paulo 2014
May 12, 2014
2. Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
3. The minimum enclosing ball problem [Tax and Duin, 2004]
Given n points {x_i, i = 1, ..., n}, find the center c and the radius R of the smallest ball enclosing them all:
$$\min_{R \in \mathbb{R},\, c \in \mathbb{R}^d} R^2 \quad \text{with} \quad \|x_i - c\|^2 \le R^2, \quad i = 1, \dots, n$$
What is that in the convex programming hierarchy? LP, QP, QCQP, SOCP and SDP.
Stéphane Canu (INSA Rouen - LITIS) May 12, 2014 3 / 35
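Before turning to convex programming, note that this ball can already be approximated without any QP solver. The sketch below is my illustration, not a method developed in the slides: it uses the Bădoiu–Clarkson core-set update, which repeatedly pulls the center toward the current farthest point with a decaying step.

```python
import numpy as np

def minimum_enclosing_ball(X, n_iter=2000):
    """Approximate the minimum enclosing ball of the rows of X
    with the Badoiu-Clarkson update: move the center toward the
    current farthest point with step 1/(t+1)."""
    c = X.mean(axis=0)                       # start at the centroid
    for t in range(1, n_iter + 1):
        far = X[np.argmax(np.linalg.norm(X - c, axis=1))]
        c = c + (far - c) / (t + 1)          # shrinking step toward the farthest point
    R = np.linalg.norm(X - c, axis=1).max()  # radius that makes the ball feasible
    return c, R
```

After t iterations the returned radius is within a (1 + O(1/√t)) factor of the optimal one, which is often enough to sanity-check a QP solution.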
6. The convex programming hierarchy (part of)
LP: $\min_x\ f^\top x$ with $Ax \le d$ and $0 \le x$
QP: $\min_x\ \tfrac12 x^\top G x + f^\top x$ with $Ax \le d$
QCQP: $\min_x\ \tfrac12 x^\top G x + f^\top x$ with $x^\top B_i x + a_i^\top x \le d_i$, $i = 1, \dots, n$
SOCP: $\min_x\ f^\top x$ with $\|x - a_i\| \le b_i^\top x + d_i$, $i = 1, \dots, n$
The convex programming hierarchy? Model generality: LP < QP < QCQP < SOCP < SDP.
7. MEB as a QP in the primal
Theorem (MEB as a QP). The two following problems are equivalent:
$$\min_{R \in \mathbb{R},\, c \in \mathbb{R}^d} R^2 \quad \text{with} \quad \|x_i - c\|^2 \le R^2,\ i = 1, \dots, n$$
$$\min_{w, \rho}\ \tfrac12 \|w\|^2 - \rho \quad \text{with} \quad w^\top x_i \ge \rho + \tfrac12 \|x_i\|^2,\ i = 1, \dots, n$$
with $\rho = \tfrac12(\|c\|^2 - R^2)$ and $w = c$.
Proof:
$$\|x_i - c\|^2 \le R^2 \iff \|x_i\|^2 - 2 x_i^\top c + \|c\|^2 \le R^2 \iff -2 x_i^\top c \le R^2 - \|x_i\|^2 - \|c\|^2$$
$$\iff 2 x_i^\top c \ge -R^2 + \|x_i\|^2 + \|c\|^2 \iff x_i^\top c \ge \underbrace{\tfrac12\left(\|c\|^2 - R^2\right)}_{\rho} + \tfrac12 \|x_i\|^2$$
8. MEB and the one class SVM
SVDD:
$$\min_{w,\rho}\ \tfrac12\|w\|^2 - \rho \quad \text{with} \quad w^\top x_i \ge \rho + \tfrac12\|x_i\|^2$$
SVDD and linear OCSVM (supporting hyperplane): if $\|x_i\|^2 = \text{constant}$ for all $i = 1,\dots,n$, it is the linear one-class SVM (OC-SVM).
The linear one-class SVM [Schölkopf and Smola, 2002]:
$$\min_{w,\rho'}\ \tfrac12\|w\|^2 - \rho' \quad \text{with} \quad w^\top x_i \ge \rho'$$
with $\rho' = \rho + \tfrac12\|x_i\|^2$ ⇒ OC-SVM is a particular case of SVDD.
9. When $\|x_i\|^2 = 1$ for all $i = 1,\dots,n$
$$\|x_i - c\|^2 \le R^2 \iff w^\top x_i \ge \rho \quad \text{with} \quad \rho = \tfrac12\left(\|c\|^2 - R^2 + 1\right)$$
SVDD and OCSVM: "belonging to the ball" is also "being above" a hyperplane.
10. MEB: KKT
$$L(c, R, \alpha) = R^2 + \sum_{i=1}^n \alpha_i \left( \|x_i - c\|^2 - R^2 \right)$$
KKT conditions:
stationarity: $2c \sum_{i=1}^n \alpha_i - 2 \sum_{i=1}^n \alpha_i x_i = 0$ ← the representer theorem, and $1 - \sum_{i=1}^n \alpha_i = 0$
primal admissibility: $\|x_i - c\|^2 \le R^2$
dual admissibility: $\alpha_i \ge 0,\ i = 1,\dots,n$
complementarity: $\alpha_i \left( \|x_i - c\|^2 - R^2 \right) = 0,\ i = 1,\dots,n$
Complementarity tells us there are two groups of points: the support vectors, for which $\|x_i - c\|^2 = R^2$, and the insiders, for which $\alpha_i = 0$.
12. MEB: Dual
The representer theorem:
$$c = \frac{\sum_{i=1}^n \alpha_i x_i}{\sum_{i=1}^n \alpha_i} = \sum_{i=1}^n \alpha_i x_i$$
Plugging it into the Lagrangian:
$$L(\alpha) = \sum_{i=1}^n \alpha_i \Big\| x_i - \sum_{j=1}^n \alpha_j x_j \Big\|^2$$
With $\sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j\, x_i^\top x_j = \alpha^\top G \alpha$ and $\sum_{i=1}^n \alpha_i\, x_i^\top x_i = \alpha^\top \mathrm{diag}(G)$, where $G = X X^\top$ is the Gram matrix, $G_{ij} = x_i^\top x_j$:
$$\min_{\alpha \in \mathbb{R}^n}\ \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G) \quad \text{with} \quad e^\top \alpha = 1 \ \text{and}\ 0 \le \alpha_i,\ i = 1, \dots, n$$
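Because the feasible set of this dual is the probability simplex, a conditional-gradient (Frank–Wolfe) loop is enough to solve it numerically: the linear oracle over the simplex is just a vertex. The solver choice below is my own sketch, not something the slides prescribe.

```python
import numpy as np

def meb_dual_frank_wolfe(X, n_iter=5000):
    """Solve  min_a a'Ga - a'diag(G)  s.t.  e'a = 1, a >= 0
    by Frank-Wolfe over the probability simplex."""
    n = X.shape[0]
    G = X @ X.T                       # Gram matrix G_ij = x_i' x_j
    d = np.diag(G).copy()
    a = np.full(n, 1.0 / n)
    for t in range(n_iter):
        grad = 2 * G @ a - d          # gradient of the dual objective
        i = np.argmin(grad)           # best simplex vertex e_i
        gamma = 2.0 / (t + 2)
        a = (1 - gamma) * a
        a[i] += gamma
    c = a @ X                         # representer theorem: c = sum_i a_i x_i
    R2 = d @ a - a @ (G @ a)          # strong duality: R^2 = a'diag(G) - a'Ga
    return a, c, R2
```

The last line uses the identity derived below on the "Looking for R²" slide: at the optimum the dual value equals R².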
13. SVDD primal vs. dual
Primal:
$$\min_{R \in \mathbb{R},\, c \in \mathbb{R}^d} R^2 \quad \text{with} \quad \|x_i - c\|^2 \le R^2,\ i = 1, \dots, n$$
d + 1 unknowns
n constraints
can be recast as a QP
perfect when d ≪ n
Dual:
$$\min_{\alpha}\ \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G) \quad \text{with} \quad e^\top \alpha = 1 \ \text{and}\ 0 \le \alpha_i,\ i = 1, \dots, n$$
n unknowns, with G the pairwise influence Gram matrix
n box constraints
easy to solve
to be used when d > n
But where is R²?
15. Looking for R²
$$\min_{\alpha}\ \alpha^\top G \alpha - \alpha^\top \mathrm{diag}(G) \quad \text{with} \quad e^\top \alpha = 1,\ 0 \le \alpha_i,\ i = 1,\dots,n$$
The Lagrangian: $L(\alpha, \mu, \beta) = \alpha^\top G\alpha - \alpha^\top \mathrm{diag}(G) + \mu(e^\top\alpha - 1) - \beta^\top\alpha$
Stationarity condition: $\nabla_\alpha L(\alpha,\mu,\beta) = 2G\alpha - \mathrm{diag}(G) + \mu e - \beta = 0$
The bi-dual:
$$\min_{\alpha}\ \alpha^\top G\alpha + \mu \quad \text{with} \quad -2G\alpha + \mathrm{diag}(G) \le \mu e$$
By identification, $R^2 = \mu + \alpha^\top G\alpha = \mu + \|c\|^2$, where $\mu$ is the Lagrange multiplier associated with the equality constraint $\sum_{i=1}^n \alpha_i = 1$.
Also, because of the complementarity condition, if $x_i$ is a support vector ($\alpha_i > 0$), then $\beta_i = 0$ and $R^2 = \|x_i - c\|^2$.
16. Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
17. The minimum enclosing ball problem with errors
The same road map: initial formulation, reformulation (as a QP), Lagrangian, KKT, dual formulation, bi-dual.
Initial formulation, for a given C:
$$\min_{R, c, \xi}\ R^2 + C \sum_{i=1}^n \xi_i \quad \text{with} \quad \|x_i - c\|^2 \le R^2 + \xi_i \ \text{and}\ \xi_i \ge 0,\ i = 1, \dots, n$$
18. The MEB with slack: QP, KKT, dual and R²
SVDD as a QP:
$$\min_{w, \rho, \xi}\ \tfrac12\|w\|^2 - \rho + \tfrac{C}{2}\sum_{i=1}^n \xi_i \quad \text{with} \quad w^\top x_i \ge \rho + \tfrac12\|x_i\|^2 - \tfrac12\xi_i \ \text{and}\ \xi_i \ge 0,\ i = 1,\dots,n$$
again with OC-SVM as a particular case.
With $G = XX^\top$, the dual SVDD is:
$$\min_{\alpha}\ \alpha^\top G\alpha - \alpha^\top\mathrm{diag}(G) \quad \text{with} \quad e^\top\alpha = 1 \ \text{and}\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
for a given $C \le 1$. If C is larger than one it is useless (it's the no-slack case).
$$R^2 = \mu + c^\top c$$
with $\mu$ denoting the Lagrange multiplier associated with the equality constraint $\sum_{i=1}^n \alpha_i = 1$.
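The only change relative to the no-slack dual is the box $0 \le \alpha_i \le C$. In a conditional-gradient solver, for instance, this only changes the linear minimization oracle: instead of a simplex vertex, the oracle greedily loads mass C onto the coordinates with the smallest gradient. This is my own illustration, not from the slides.

```python
import numpy as np

def capped_simplex_oracle(grad, C):
    """Linear minimization oracle over {a : sum(a) = 1, 0 <= a <= C}:
    fill the coordinates with the smallest gradient, C at a time."""
    v = np.zeros_like(grad)
    mass = 1.0
    for i in np.argsort(grad):   # cheapest coordinates first
        v[i] = min(C, mass)
        mass -= v[i]
        if mass <= 0:
            break
    return v
```

With C = 1 the oracle returns a plain simplex vertex, which matches the remark that C larger than one is useless (the no-slack case).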
19. Variations over SVDD
Adaptive SVDD: the weighted error case, for given $w_i,\ i = 1,\dots,n$:
$$\min_{c \in \mathbb{R}^p,\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n}\ R + C\sum_{i=1}^n w_i\xi_i \quad \text{with} \quad \|x_i - c\|^2 \le R + \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
The dual of this problem is a QP [see for instance Liu et al., 2013]:
$$\min_{\alpha \in \mathbb{R}^n}\ \alpha^\top XX^\top\alpha - \alpha^\top\mathrm{diag}(XX^\top) \quad \text{with} \quad \sum_{i=1}^n \alpha_i = 1,\ 0 \le \alpha_i \le C w_i,\ i = 1,\dots,n$$
Density-induced SVDD (D-SVDD):
$$\min_{c \in \mathbb{R}^p,\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n}\ R + C\sum_{i=1}^n \xi_i \quad \text{with} \quad w_i\|x_i - c\|^2 \le R + \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
20. Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
21. SVDD in a RKHS
The feature map: $\mathbb{R}^p \longrightarrow \mathcal{H}$, with $c \longmapsto f(\bullet)$ and $x_i \longmapsto k(x_i, \bullet)$, so that
$$\|x_i - c\|^2_{\mathbb{R}^p} \le R^2 \ \longrightarrow\ \|k(x_i, \bullet) - f(\bullet)\|^2_{\mathcal{H}} \le R^2$$
Kernelized SVDD (in a RKHS) is also a QP:
$$\min_{f \in \mathcal{H},\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n}\ R^2 + C\sum_{i=1}^n \xi_i \quad \text{with} \quad \|k(x_i,\bullet) - f(\bullet)\|^2_{\mathcal{H}} \le R^2 + \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
22. SVDD in a RKHS: KKT, dual and R²
$$L = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(\|k(x_i,\cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
$$= R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(k(x_i,x_i) - 2f(x_i) + \|f\|^2_{\mathcal{H}} - R^2 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
KKT conditions:
Stationarity: $2f(\cdot)\sum_{i=1}^n \alpha_i - 2\sum_{i=1}^n \alpha_i k(\cdot, x_i) = 0$ ← the representer theorem; $1 - \sum_{i=1}^n \alpha_i = 0$; $C - \alpha_i - \beta_i = 0$
Primal admissibility: $\|k(x_i,\cdot) - f(\cdot)\|^2 \le R^2 + \xi_i$, $\xi_i \ge 0$
Dual admissibility: $\alpha_i \ge 0$, $\beta_i \ge 0$
Complementarity: $\alpha_i\left(\|k(x_i,\cdot) - f(\cdot)\|^2 - R^2 - \xi_i\right) = 0$; $\beta_i\xi_i = 0$
23. SVDD in a RKHS: dual and R²
$$L(\alpha) = \sum_{i=1}^n \alpha_i k(x_i,x_i) - 2\sum_{i=1}^n \alpha_i f(x_i) + \|f\|^2_{\mathcal{H}} \quad \text{with} \quad f(\cdot) = \sum_{j=1}^n \alpha_j k(\cdot, x_j)$$
$$= \sum_{i=1}^n \alpha_i k(x_i,x_i) - \sum_{i=1}^n\sum_{j=1}^n \alpha_i\alpha_j \underbrace{k(x_i,x_j)}_{G_{ij}}$$
With $G_{ij} = k(x_i, x_j)$:
$$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\mathrm{diag}(G) \quad \text{with} \quad e^\top\alpha = 1 \ \text{and}\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
As in the linear case,
$$R^2 = \mu + \|f\|^2_{\mathcal{H}}$$
with $\mu$ denoting the Lagrange multiplier associated with the equality constraint $\sum_{i=1}^n \alpha_i = 1$.
24. SVDD train and val in a RKHS
Train using the dual form (in: G, C; out: α, μ):
$$\min_\alpha\ \alpha^\top G\alpha - \alpha^\top\mathrm{diag}(G) \quad \text{with} \quad e^\top\alpha = 1 \ \text{and}\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
Val with the center in the RKHS, $f(\cdot) = \sum_{i=1}^n \alpha_i k(\cdot, x_i)$:
$$\begin{aligned}
\varphi(x) &= \|k(x,\cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 \\
&= \|k(x,\cdot)\|^2_{\mathcal{H}} - 2\langle k(x,\cdot), f(\cdot)\rangle_{\mathcal{H}} + \|f(\cdot)\|^2_{\mathcal{H}} - R^2 \\
&= k(x,x) - 2f(x) + R^2 - \mu - R^2 \\
&= -2f(x) + k(x,x) - \mu \\
&= -2\sum_{i=1}^n \alpha_i k(x, x_i) + k(x,x) - \mu
\end{aligned}$$
$\varphi(x) = 0$ is the decision border.
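The train and val steps can be put together in a few lines. The sketch below is mine, not the lecture's implementation: the Gaussian bandwidth, the toy data and the Frank–Wolfe solver are assumptions, and instead of extracting μ it recovers R² from a support vector through the complementarity condition.

```python
import numpy as np

def rbf(A, B, s=1.0):
    """Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 s^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * s * s))

def svdd_fit(X, s=1.0, n_iter=3000):
    """Train: Frank-Wolfe on min_a a'Ka - a'diag(K), e'a = 1, a >= 0."""
    K = rbf(X, X, s)
    d = np.diag(K)
    a = np.full(len(X), 1.0 / len(X))
    for t in range(n_iter):
        i = np.argmin(2 * K @ a - d)
        g = 2.0 / (t + 2)
        a = (1 - g) * a
        a[i] += g
    f2 = a @ K @ a                        # ||f||^2_H
    sv = np.argmax(a)                     # a support vector (largest alpha)
    R2 = K[sv, sv] - 2 * K[sv] @ a + f2   # complementarity: R^2 = ||k(x_sv,.) - f||^2
    return a, f2, R2

def svdd_score(x, X, a, f2, R2, s=1.0):
    """Val: phi(x) = k(x,x) - 2 f(x) + ||f||^2_H - R^2; phi <= 0 inside the ball."""
    kx = rbf(x[None, :], X, s)[0]
    return 1.0 - 2.0 * kx @ a + f2 - R2   # k(x,x) = 1 for the Gaussian kernel
```

The sign of `svdd_score` reproduces the decision border $\varphi(x) = 0$: negative inside the description, positive outside.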
25. An important theoretical result
For a well-calibrated bandwidth, the SVDD estimates the underlying distribution level set [Vert and Vert, 2006].
The level sets of a probability density function $\mathbb{P}(x)$ are the sets $C_p = \{x \in \mathbb{R}^d \mid \mathbb{P}(x) \ge p\}$.
They are well estimated by the empirical minimum volume set $V_p = \{x \in \mathbb{R}^d \mid \|k(x,\cdot) - f(\cdot)\|^2_{\mathcal{H}} - R^2 \ge 0\}$.
The frontiers coincide.
26. SVDD: the generalization error
For a well-calibrated bandwidth, with $(x_1, \dots, x_n)$ i.i.d. from some fixed but unknown $\mathbb{P}(x)$, then [Shawe-Taylor and Cristianini, 2004], with probability at least $1 - \delta$ ($\forall \delta \in\ ]0,1[$), for any margin $m > 0$:
$$\mathbb{P}\left(\|k(x,\cdot) - f(\cdot)\|^2_{\mathcal{H}} \ge R^2 + m\right) \le \frac{1}{mn}\sum_{i=1}^n \xi_i + \frac{6R^2}{m\sqrt{n}} + 3\sqrt{\frac{\ln(2/\delta)}{2n}}$$
27. Equivalence between SVDD and OCSVM for translation-invariant kernels (diagonal-constant kernels)
Theorem. Let $\mathcal{H}$ be a RKHS on some domain $\mathcal{X}$ endowed with kernel $k$. If there exists some constant $c$ such that $\forall x \in \mathcal{X},\ k(x,x) = c$, then the two following problems are equivalent:
$$\min_{f,R,\xi}\ R + C\sum_{i=1}^n \xi_i \quad \text{with} \quad \|k(x_i,\cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R + \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
$$\min_{f,\rho,\varepsilon}\ \tfrac12\|f\|^2_{\mathcal{H}} - \rho + C\sum_{i=1}^n \varepsilon_i \quad \text{with} \quad f(x_i) \ge \rho - \varepsilon_i,\ \varepsilon_i \ge 0,\ i = 1,\dots,n$$
with $\rho = \tfrac12(c + \|f\|^2_{\mathcal{H}} - R)$ and $\varepsilon_i = \tfrac12\xi_i$.
28. Proof of the equivalence between SVDD and OCSVM
$$\min_{f \in \mathcal{H},\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n}\ R + C\sum_{i=1}^n \xi_i \quad \text{with} \quad \|k(x_i,\cdot) - f(\cdot)\|^2_{\mathcal{H}} \le R + \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
Since $\|k(x_i,\cdot) - f(\cdot)\|^2_{\mathcal{H}} = k(x_i,x_i) + \|f\|^2_{\mathcal{H}} - 2f(x_i)$:
$$\min_{f \in \mathcal{H},\, R \in \mathbb{R},\, \xi \in \mathbb{R}^n}\ R + C\sum_{i=1}^n \xi_i \quad \text{with} \quad 2f(x_i) \ge k(x_i,x_i) + \|f\|^2_{\mathcal{H}} - R - \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n.$$
Introducing $\rho = \tfrac12(c + \|f\|^2_{\mathcal{H}} - R)$, that is $R = c + \|f\|^2_{\mathcal{H}} - 2\rho$, and since $k(x_i,x_i)$ is constant and equal to $c$, the SVDD problem becomes
$$\min_{f \in \mathcal{H},\, \rho \in \mathbb{R},\, \xi \in \mathbb{R}^n}\ \tfrac12\|f\|^2_{\mathcal{H}} - \rho + \tfrac{C}{2}\sum_{i=1}^n \xi_i \quad \text{with} \quad f(x_i) \ge \rho - \tfrac12\xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
29. Leading to the classical one-class SVM formulation (OCSVM)
$$\min_{f \in \mathcal{H},\, \rho \in \mathbb{R},\, \varepsilon \in \mathbb{R}^n}\ \tfrac12\|f\|^2_{\mathcal{H}} - \rho + C\sum_{i=1}^n \varepsilon_i \quad \text{with} \quad f(x_i) \ge \rho - \varepsilon_i,\ \varepsilon_i \ge 0,\ i = 1,\dots,n$$
with $\varepsilon_i = \tfrac12\xi_i$. Note that by putting $\nu = \tfrac{1}{nC}$ we can get the so-called $\nu$-formulation of the OCSVM:
$$\min_{f' \in \mathcal{H},\, \rho' \in \mathbb{R},\, \xi' \in \mathbb{R}^n}\ \tfrac12\|f'\|^2_{\mathcal{H}} - n\nu\rho' + \sum_{i=1}^n \xi'_i \quad \text{with} \quad f'(x_i) \ge \rho' - \xi'_i,\ \xi'_i \ge 0,\ i = 1,\dots,n$$
with $f' = Cf$, $\rho' = C\rho$, and $\xi' = C\xi$.
30. Duality
Note that the dual of the SVDD is
$$\min_{\alpha \in \mathbb{R}^n}\ \alpha^\top G\alpha - \alpha^\top g \quad \text{with} \quad \sum_{i=1}^n \alpha_i = 1,\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
where $G$ is the kernel matrix of general term $G_{i,j} = k(x_i,x_j)$ and $g$ the diagonal vector such that $g_i = k(x_i,x_i) = c$. The dual of the OCSVM is the following equivalent QP:
$$\min_{\alpha \in \mathbb{R}^n}\ \tfrac12\alpha^\top G\alpha \quad \text{with} \quad \sum_{i=1}^n \alpha_i = 1,\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
Both dual forms provide the same solution $\alpha$, but not the same Lagrange multipliers. $\rho$ is the Lagrange multiplier of the equality constraint of the dual of the OCSVM, and $R = c + \alpha^\top G\alpha - 2\rho$. Using the SVDD dual, it turns out that $R = \lambda_{eq} + \alpha^\top G\alpha$, where $\lambda_{eq}$ is the Lagrange multiplier of the equality constraint of the SVDD dual form.
31. Plan
1 Support Vector Data Description (SVDD)
SVDD, the smallest enclosing ball problem
The minimum enclosing ball problem with errors
The minimum enclosing ball problem in a RKHS
The two class Support vector data description (SVDD)
32. The two class Support vector data description (SVDD)
[Figure: two-class data in the plane with the SVDD ball.]
$$\min_{c,R,\xi^+,\xi^-}\ R^2 + C\Big(\sum_{i:\, y_i=1} \xi_i^+ + \sum_{i:\, y_i=-1} \xi_i^-\Big)$$
$$\text{with}\quad \|x_i - c\|^2 \le R^2 + \xi_i^+,\ \xi_i^+ \ge 0 \ \text{for } i \text{ such that } y_i = 1,$$
$$\text{and}\quad \|x_i - c\|^2 \ge R^2 - \xi_i^-,\ \xi_i^- \ge 0 \ \text{for } i \text{ such that } y_i = -1.$$
33. The two class SVDD as a QP
$$\min_{c,R,\xi^+,\xi^-}\ R^2 + C\Big(\sum_{y_i=1} \xi_i^+ + \sum_{y_i=-1} \xi_i^-\Big) \quad \text{with} \quad \|x_i - c\|^2 \le R^2 + \xi_i^+,\ \xi_i^+ \ge 0 \ (y_i = 1), \quad \|x_i - c\|^2 \ge R^2 - \xi_i^-,\ \xi_i^- \ge 0 \ (y_i = -1)$$
Expanding $\|x_i - c\|^2 = \|x_i\|^2 - 2 x_i^\top c + \|c\|^2$:
$$\|x_i\|^2 - 2 x_i^\top c + \|c\|^2 \le R^2 + \xi_i^+ \ (y_i = 1), \qquad \|x_i\|^2 - 2 x_i^\top c + \|c\|^2 \ge R^2 - \xi_i^- \ (y_i = -1)$$
$$2 x_i^\top c \ge \|c\|^2 - R^2 + \|x_i\|^2 - \xi_i^+ \ (y_i = 1), \qquad -2 x_i^\top c \ge -\|c\|^2 + R^2 - \|x_i\|^2 - \xi_i^- \ (y_i = -1)$$
Both cases merge into a single constraint:
$$2 y_i\, x_i^\top c \ge y_i\left(\|c\|^2 - R^2 + \|x_i\|^2\right) - \xi_i,\ \xi_i \ge 0,\ i = 1,\dots,n$$
Change of variable $\rho = \|c\|^2 - R^2$:
$$\min_{c,\rho,\xi}\ \|c\|^2 - \rho + C\sum_{i=1}^n \xi_i \quad \text{with} \quad 2 y_i\, x_i^\top c \ge y_i\left(\rho + \|x_i\|^2\right) - \xi_i \ \text{and}\ \xi_i \ge 0,\ i = 1,\dots,n$$
34. The dual of the two class SVDD
With $G_{ij} = y_i y_j\, x_i^\top x_j$, the dual formulation is:
$$\min_{\alpha \in \mathbb{R}^n}\ \alpha^\top G\alpha - \sum_{i=1}^n \alpha_i y_i \|x_i\|^2 \quad \text{with} \quad \sum_{i=1}^n y_i\alpha_i = 1 \ \text{and}\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
35. The two class SVDD vs. one class SVDD
The two class SVDD (left) vs. the one class SVDD (right)
36. Small Sphere and Large Margin (SSLM) approach
Support vector data description with margin [Wu and Ye, 2009]:
$$\min_{c,R,\xi \in \mathbb{R}^n}\ R^2 + C\Big(\sum_{y_i=1}\xi_i^+ + \sum_{y_i=-1}\xi_i^-\Big)$$
$$\text{with}\quad \|x_i - c\|^2 \le R^2 - 1 + \xi_i^+,\ \xi_i^+ \ge 0 \ \text{for } i \text{ such that } y_i = 1,$$
$$\text{and}\quad \|x_i - c\|^2 \ge R^2 + 1 - \xi_i^-,\ \xi_i^- \ge 0 \ \text{for } i \text{ such that } y_i = -1.$$
Since for $y_i = -1$ the constraint $\|x_i - c\|^2 \ge R^2 + 1 - \xi_i^-$ is equivalent to $y_i\|x_i - c\|^2 \le y_i R^2 - 1 + \xi_i^-$, the Lagrangian reads
$$L(c,R,\xi,\alpha,\beta) = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(y_i\|x_i - c\|^2 - y_i R^2 + 1 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
[Figure: SSLM solution on two-class data in the plane.]
37. SVDD with margin – dual formulation
$$L(c,R,\xi,\alpha,\beta) = R^2 + C\sum_{i=1}^n \xi_i + \sum_{i=1}^n \alpha_i\left(y_i\|x_i - c\|^2 - y_i R^2 + 1 - \xi_i\right) - \sum_{i=1}^n \beta_i\xi_i$$
Optimality: $c = \sum_{i=1}^n \alpha_i y_i x_i$; $\sum_{i=1}^n \alpha_i y_i = 1$; $0 \le \alpha_i \le C$.
$$L(\alpha) = \sum_{i=1}^n \alpha_i y_i \Big\|x_i - \sum_{j=1}^n \alpha_j y_j x_j\Big\|^2 + \sum_{i=1}^n \alpha_i$$
$$= -\sum_{i=1}^n\sum_{j=1}^n \alpha_j\alpha_i y_i y_j\, x_j^\top x_i + \sum_{i=1}^n \|x_i\|^2 y_i\alpha_i + \sum_{i=1}^n \alpha_i$$
Dual SVDD is also a quadratic program (problem D):
$$\min_{\alpha \in \mathbb{R}^n}\ \alpha^\top G\alpha - e^\top\alpha - f^\top\alpha \quad \text{with} \quad y^\top\alpha = 1 \ \text{and}\ 0 \le \alpha_i \le C,\ i = 1,\dots,n$$
with $G$ a symmetric $n \times n$ matrix such that $G_{ij} = y_i y_j\, x_j^\top x_i$ and $f_i = \|x_i\|^2 y_i$.
38. Conclusion
Applications:
outlier detection
change detection
clustering
large number of classes
variable selection, . . .
A clear path:
reformulation (to a standard problem)
KKT
dual
bidual
A lot of variations:
L2 SVDD
two classes non symmetric
two classes in the symmetric classes (SVM)
the multi-class issue
practical problems with translation-invariant kernels
39. Bibliography
Bo Liu, Yanshan Xiao, Longbing Cao, Zhifeng Hao, and Feiqi Deng. SVDD-based outlier detection on uncertain data. Knowledge and Information Systems, 34(3):597–618, 2013.
B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.
John Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
David M. J. Tax and Robert P. W. Duin. Support vector data description. Machine Learning, 54(1):45–66, 2004.
Régis Vert and Jean-Philippe Vert. Consistency and convergence rates of one-class SVMs and related algorithms. The Journal of Machine Learning Research, 7:817–854, 2006.
Mingrui Wu and Jieping Ye. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(11):2088–2092, 2009.