
- 1. Manifold Learning, with Applications to Object Recognition. Advanced Perception. David R. Thompson
- 2. Agenda: 1. why learn manifolds? 2. Isomap 3. LLE 4. applications
- 3. Types of "manifold": an exhaust manifold; Sir Walter Synnot Manifold (1849-1928); and ours: a low-D surface embedded in a high-D space.
- 4. Manifold learning: find a low-D basis for describing high-D data, i.e. a mapping X → X' such that dim(X') << dim(X). It uncovers the intrinsic dimensionality and is invertible.
- 5. Manifolds in vision: the plenoptic function, motion, occlusion.
- 6. Manifolds in vision: appearance variation (images from Hormel Corp.).
- 7. Manifolds in vision: deformation (images from www.golfswingphotos.com).
- 8. Why do manifold learning? 1. data compression; 2. the "curse of dimensionality"; 3. de-noising; 4. visualization; 5. reasonable distance metrics.
- 9. Reasonable distance metrics (example).
- 10. Reasonable distance metrics (example).
- 11. Reasonable distance metrics: which interpolation is right?
- 12. Reasonable distance metrics: linear interpolation.
- 13. Reasonable distance metrics: manifold interpolation.
- 14. Agenda: 1. why learn manifolds? 2. Isomap 3. LLE 4. applications
- 15. Isomap: for n data points and a distance matrix D, with Dij = distance between points i and j, we can construct an m-dimensional space that preserves the inter-point distances by using the top eigenvectors of D scaled by the square roots of their eigenvalues: yi = [ √λ1 v1i, √λ2 v2i, ..., √λm vmi ].
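The embedding on this slide is classical multidimensional scaling. A minimal NumPy sketch (the function name mds_embed and the double-centering step that converts distances into inner products before the eigen-decomposition are my additions, not from the slides):

```python
import numpy as np

def mds_embed(D, m=2):
    """Classical MDS: embed n points in m dimensions from a distance matrix D
    by eigen-decomposing the double-centered squared-distance (Gram) matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    vals = vals[::-1][:m]                    # top m eigenvalues
    vecs = vecs[:, ::-1][:, :m]              # matching eigenvectors
    # y_i = [ sqrt(lambda_1) v_1i, ..., sqrt(lambda_m) v_mi ]
    return vecs * np.sqrt(np.maximum(vals, 0.0))
```

For points that genuinely lie in an m-dimensional Euclidean space, this reconstructs their inter-point distances exactly (up to rotation and reflection).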
- 16. Isomap: infer a distance matrix using distances along the manifold.
- 17. Isomap, step 1: build a sparse graph with K-nearest neighbors; the graph distance matrix Dg is sparse.
- 18. Isomap, step 2: infer the remaining inter-point distances by finding shortest paths on the graph (Dijkstra's algorithm), filling in Dg.
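Steps 1 and 2 can be sketched with off-the-shelf routines. A minimal version (the helper name geodesic_distances is hypothetical) using scikit-learn's k-NN graph and SciPy's shortest-path solver:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def geodesic_distances(X, k=5):
    """Approximate geodesic (along-manifold) distances for points X."""
    # step 1: sparse k-nearest-neighbor graph with Euclidean edge weights
    G = kneighbors_graph(X, n_neighbors=k, mode="distance")
    # step 2: all-pairs shortest paths (Dijkstra), edges treated as undirected
    return shortest_path(G, method="D", directed=False)
```

On points sampled along a curve, the resulting distance between the endpoints approximates the arc length rather than the straight-line (chord) distance.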
- 19. Isomap, step 3: build a low-D embedded space that best preserves the complete distance matrix. Error function: E = ||τ(Dg) − τ(DY)||_L2, comparing inner-product distances in the graph with inner-product distances in the new coordinate system. Solution: set the points Y to the top eigenvectors of τ(Dg).
- 20. Isomap: shortest distances on a graph are easy to compute.
- 21. Isomap results: hands
- 22. Isomap, pros and cons: + preserves global structure; + few free parameters; − sensitive to noise and noisy ("short-circuit") edges; − computationally expensive (dense-matrix eigen-decomposition).
- 23. Locally Linear Embedding: find a mapping that preserves local linear relationships between neighbors.
- 24. Locally Linear Embedding
- 25. LLE, step 1: find the weight matrix W of linear coefficients minimizing the reconstruction error E(W) = Σi ||xi − Σj Wij xj||², and enforce the sum-to-one constraint Σj Wij = 1 with a Lagrange multiplier.
- 26. LLE, step 2: find the projected vectors Y minimizing the reconstruction error Φ(Y) = Σi ||yi − Σj Wij yj||²; this must be solved for the whole dataset simultaneously.
- 27. LLE: we add constraints to prevent multiple / degenerate solutions: the outputs are centered, Σi yi = 0, and have unit covariance, (1/n) Σi yi yiᵀ = I.
- 28. LLE: the cost function becomes Φ(Y) = Σij Mij (yi · yj) with M = (I − W)ᵀ(I − W); the optimal embedded coordinates are given by the bottom m+1 eigenvectors of M (discarding the bottom, constant eigenvector).
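Putting the two steps together, a minimal LLE sketch in NumPy (the function name lle and the trace-based regularization of the local covariance, added for numerical stability when k exceeds the input dimension, are assumptions, not from the slides):

```python
import numpy as np

def lle(X, k=8, m=2, reg=1e-3):
    """Minimal Locally Linear Embedding: local weights, then bottom
    eigenvectors of M = (I - W)^T (I - W)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # pairwise distances
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]   # k nearest neighbors (skip self)
        Z = X[nbrs] - X[i]                 # neighbors centered on x_i
        C = Z @ Z.T                        # local covariance (Gram) matrix
        C += reg * np.trace(C) * np.eye(k) # regularize (assumed, for stability)
        w = np.linalg.solve(C, np.ones(k)) # solve for reconstruction weights
        W[i, nbrs] = w / w.sum()           # enforce sum-to-one constraint
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    vals, vecs = np.linalg.eigh(M)         # ascending eigenvalues
    return vecs[:, 1:m + 1]                # bottom m+1, drop constant vector
```

Because the embedding is built from orthonormal eigenvectors, the returned coordinates automatically satisfy the unit-covariance constraint from slide 27 (up to the 1/n scaling).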
- 29. LLE result: preserves local topology (comparison: PCA vs. LLE).
- 30. LLE, pros and cons: + no local minima and only one free parameter; + incremental and fast; + simple linear-algebra operations; − can distort global structure.
- 31. Others you may encounter: Laplacian Eigenmaps (Belkin 2001), a spectral method similar to LLE that better preserves clusters in the data; Kernel PCA; the Kohonen Self-Organizing Map (Kohonen, 1990), an iterative algorithm fitting a network of pre-defined connectivity, which is simple and fast for on-line learning but suffers from local minima and lacks theoretical justification.
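For practical work, the main methods covered in this deck ship with scikit-learn; a short sketch on the classic "swiss roll" toy manifold (parameter values here are arbitrary choices, not recommendations from the slides):

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding

# toy manifold: a 2-D sheet rolled up in 3-D
X, color = make_swiss_roll(n_samples=500, random_state=0)

# Isomap: k-NN graph -> shortest paths -> MDS embedding
iso_Y = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

# LLE: local reconstruction weights -> bottom eigenvectors
lle_Y = LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                               random_state=0).fit_transform(X)
```

Both calls return one 2-D coordinate pair per input point; plotting them colored by the roll parameter is the usual way to check that the sheet has been "unrolled".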
- 32. No free lunch: the "curvier" your manifold, the denser your data must be.
- 33. Conclusions: manifold learning is a key tool in your object-recognition toolbox, and a formal framework for many different ad-hoc object recognition techniques.
