
- 1. Presented by Danushka Bollegala
- 2. Spectrum = the set of eigenvalues. By looking at the spectrum we can learn about the graph itself! Spectral clustering is a way of normalizing data (a canonical form) and then performing clustering (e.g. via k-means) in this normalized/reduced space. Input: a similarity matrix. Output: a set of (non-overlapping/hard) clusters.
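The pipeline summarized on this slide can be sketched in NumPy. This is a minimal unnormalized version under stated assumptions (the slide does not fix the Laplacian variant or the k-means initialization; the function name and the farthest-point seeding are illustrative choices):

```python
import numpy as np

def spectral_clustering(S, k, n_iter=50):
    """Minimal sketch: similarity matrix S -> k hard cluster labels."""
    D = np.diag(S.sum(axis=1))          # degree matrix
    L = D - S                           # unnormalized graph Laplacian L = D - W
    vals, vecs = np.linalg.eigh(L)      # eigenvalues in ascending order
    U = vecs[:, :k]                     # embed each point as a row of the first k eigenvectors
    # Deterministic farthest-point initialization, then plain Lloyd's k-means.
    centers = [U[0]]
    for _ in range(1, k):
        dist = np.min([((U - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(U[np.argmax(dist)])
    centers = np.stack(centers)
    for _ in range(n_iter):
        labels = np.argmin(((U[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels
```

On a block-structured similarity matrix (two tight groups, weak cross-links), the returned labels recover the two blocks.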
- 3. Undirected graph G(V, E). V: set of vertices (nodes in the network). E: set of edges (links in the network). ▪ Weight wij is the weight of the edge connecting vertices i and j (represented by the affinity matrix). Degree: the sum of the weights of the edges attached to a vertex. Measuring the size of a subset A of V.
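The definitions on this slide in code form (the slide's size-measure formulas were images; this sketch assumes the usual two measures, |A| = number of vertices and vol(A) = sum of degrees in A):

```python
import numpy as np

W = np.array([[0., 1., 2.],
              [1., 0., 0.],
              [2., 0., 0.]])   # affinity matrix of a tiny undirected graph
degree = W.sum(axis=1)          # d_i = sum_j w_ij
A = [0, 1]                      # a subset of vertices
size_A = len(A)                 # |A| = number of vertices in A
vol_A = degree[A].sum()         # vol(A) = sum of degrees of vertices in A
```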
- 4. How to create the affinity matrix W from the similarity matrix S? ε-neighborhood graph ▪ Connect all vertices whose pairwise similarity is greater than ε. k-nearest neighbor graph ▪ Connect each vertex to its k nearest neighbors. ▪ Use mutual k-nearest neighbor graphs for asymmetric S. Fully connected graph ▪ Use the Gaussian similarity function (kernel).
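The three constructions above can be sketched as follows (function names are illustrative; the Gaussian kernel formula is the standard exp(−||xi − xj||² / 2σ²)):

```python
import numpy as np

def gaussian_affinity(X, sigma=1.0):
    """Fully connected graph: W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq / (2 * sigma ** 2))

def knn_affinity(S, k):
    """k-nearest-neighbor graph from a similarity matrix S, symmetrized."""
    n = len(S)
    W = np.zeros_like(S)
    for i in range(n):
        nbrs = np.argsort(S[i])[::-1]   # most similar first
        nbrs = nbrs[nbrs != i][:k]      # skip self, keep the k nearest
        W[i, nbrs] = S[i, nbrs]
    return np.maximum(W, W.T)           # keep edge i-j if either end is a k-NN of the other

def eps_affinity(S, eps):
    """Epsilon-neighborhood graph: keep edges with similarity above eps."""
    W = np.where(S > eps, S, 0.0)
    np.fill_diagonal(W, 0.0)
    return W
```

Using `np.minimum` instead of `np.maximum` in `knn_affinity` gives the mutual k-NN variant mentioned for asymmetric S.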
- 5. L = D − W. D: degree matrix, the diagonal matrix diag(d1,...,dn). Properties ▪ For every vector f, f^T L f = (1/2) Σij wij (fi − fj)^2. ▪ L is symmetric and positive semi-definite. ▪ The smallest eigenvalue of L is zero and the corresponding eigenvector is 1 = (1,...,1)^T. ▪ L has n non-negative, real-valued eigenvalues.
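These properties are easy to check numerically on a small, made-up weighted graph:

```python
import numpy as np

# Check the slide's Laplacian properties on a tiny weighted graph.
W = np.array([[0., 2., 1.],
              [2., 0., 0.],
              [1., 0., 0.]])
D = np.diag(W.sum(axis=1))   # degree matrix diag(d1, ..., dn)
L = D - W                    # unnormalized graph Laplacian

f = np.array([0.3, -1.2, 2.0])
quad = f @ L @ f             # f^T L f
# Equals 1/2 * sum_ij w_ij (f_i - f_j)^2, hence L is positive semi-definite.
by_formula = 0.5 * sum(W[i, j] * (f[i] - f[j]) ** 2
                       for i in range(3) for j in range(3))
eigenvalues = np.linalg.eigvalsh(L)   # ascending; the smallest should be 0
```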
- 6. Two versions exist: Lsym = D^{-1/2} L D^{-1/2} = I − D^{-1/2} W D^{-1/2}, and Lrw = D^{-1} L = I − D^{-1} W.
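Both normalized Laplacians, directly from the slide's definitions (the function name is illustrative):

```python
import numpy as np

def normalized_laplacians(W):
    """Return (L_sym, L_rw) for affinity matrix W, per the slide's two definitions."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    D_inv = np.diag(1.0 / d)
    I = np.eye(len(W))
    L_sym = I - D_inv_sqrt @ W @ D_inv_sqrt   # = D^{-1/2} L D^{-1/2}
    L_rw = I - D_inv @ W                      # = D^{-1} L
    return L_sym, L_rw
```

L_sym is symmetric while L_rw generally is not, but both share the same eigenvalues and both annihilate-the-constant-vector behavior carries over from L (L_rw · 1 = 0).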
- 7. A partition (A1,...,Ak) induces a cut on the graph. Two types of graph cuts exist. Spectral clustering solves a relaxed version of the mincut problem (therefore it is an approximation).
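The cut objectives on this slide were images; assuming the standard pair (RatioCut, which normalizes by |A|, and NCut, which normalizes by vol(A)), the cut value and RatioCut can be computed as follows (function names are illustrative):

```python
import numpy as np

def cut_value(W, A):
    """cut(A, A-complement) = total weight of edges crossing the partition."""
    mask = np.zeros(len(W), dtype=bool)
    mask[np.asarray(A)] = True
    return W[mask][:, ~mask].sum()

def ratio_cut(W, partition):
    """RatioCut(A1,...,Ak) = sum_i cut(A_i, complement) / |A_i|  (assumed objective)."""
    return sum(cut_value(W, A) / len(A) for A in partition)
```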
- 8. By the Rayleigh–Ritz theorem it follows that the minimum of the relaxed objective is attained at the second-smallest eigenvalue.
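The statement can be written out explicitly for the two-cluster case (this reconstruction assumes the standard unnormalized relaxation, since the slide's equations were images):

```latex
% Relaxed mincut: minimize the Rayleigh quotient of L over real vectors f
% orthogonal to the constant eigenvector 1. By Rayleigh--Ritz,
\min_{f \perp \mathbf{1},\; f \neq 0} \frac{f^{\top} L f}{f^{\top} f} = \lambda_2,
% attained by the eigenvector of the second-smallest eigenvalue
% (the Fiedler vector), which is then thresholded to obtain the 2-way cut.
```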
- 9. The transition probability matrix and the Laplacian are related! P = D^{-1} W, so Lrw = I − P.
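A quick numerical check of this relation (the example matrix is made up):

```python
import numpy as np

# The random walk on the graph has transition matrix P = D^{-1} W,
# so the random-walk Laplacian is just L_rw = I - P.
W = np.array([[0., 1., 2.],
              [1., 0., 1.],
              [2., 1., 0.]])
P = np.diag(1.0 / W.sum(axis=1)) @ W   # row-stochastic: each row sums to 1
L_rw = np.eye(len(W)) - P
```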
- 10. Lrw-based spectral clustering (Shi & Malik, 2000) is better (especially when the degree distribution is uneven). Use k-nearest neighbor graphs. How to set the number of clusters ▪ k = log(n), or ▪ use the eigengap heuristic. If using the Gaussian kernel, how to set σ ▪ the mean distance of a point to its log(n)+1 nearest neighbors.
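The eigengap heuristic mentioned above can be sketched as follows (the function name is illustrative; the heuristic picks the k at which the gap between consecutive small Laplacian eigenvalues is largest):

```python
import numpy as np

def choose_k_by_eigengap(L, k_max=10):
    """Eigengap heuristic: return the k where the gap between consecutive
    small eigenvalues of the Laplacian L is largest."""
    vals = np.sort(np.linalg.eigvalsh(L))[:k_max]
    gaps = np.diff(vals)                 # gaps[i] = lambda_{i+2} - lambda_{i+1}
    return int(np.argmax(gaps)) + 1      # k = position of the largest gap
```

For a graph with two tight components joined by weak edges, the first two eigenvalues are near zero and the third jumps up, so the heuristic returns k = 2.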
- 11. Eckart–Young Theorem. The best low-rank approximation B of a matrix A with rank(B) = r < rank(A) is given by B = USV^*, where A = UZV^* is the singular value decomposition of A and S equals Z except that the (r+1)-th and later singular values of Z are set to zero. The approximation minimizes the Frobenius norm ▪ min_B ||A − B||_F, subject to rank(B) = r.
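The theorem in code: zero out all singular values after the r-th and recompose (the function name is illustrative):

```python
import numpy as np

def best_rank_r(A, r):
    """Eckart-Young: best rank-r approximation of A in Frobenius norm,
    obtained by zeroing all singular values after the r-th."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_trunc = s.copy()
    s_trunc[r:] = 0.0                    # keep only the r largest singular values
    return U @ np.diag(s_trunc) @ Vt
```

The residual ||A − B||_F then equals the root-sum-square of the discarded singular values, which is what the minimization on the slide guarantees.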
