# Semi supervised learning

This presentation gives an overview of semi-supervised learning methods (the least-squares solution, eigenvectors, and eigenfunctions). It points to some of the applications these methods can be used for, such as object categorization and interactive image segmentation.


1. Semi-Supervised Learning: Ahmed Taha, Feb 2014
2. Content
   - Concept introduction
   - Graph cut and the least-squares solution
   - Eigenvectors and eigenfunctions
   - Applications
3. Concept Introduction
   - Graph cut: divide the graph into two partitions
   - What is the lowest cut cost?
4. Concept Introduction
   - Degree matrix / variation
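The degree matrix and the graph Laplacian introduced here can be built directly from a similarity matrix. A minimal numpy sketch, assuming a toy set of 2D points and a Gaussian similarity with a hand-picked bandwidth (both are illustrative assumptions, not from the slides):

```python
import numpy as np

# Hypothetical toy data: five 2D points forming two loose clusters.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [2.0, 2.0], [2.1, 2.0]])

# Similarity matrix W: Gaussian (RBF) kernel on pairwise squared distances.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq_dists / (2 * 0.5 ** 2))
np.fill_diagonal(W, 0.0)          # no self-edges

# Degree matrix D: row sums of W on the diagonal.
D = np.diag(W.sum(axis=1))

# Graph Laplacian L = D - W.
L = D - W
```

Nearby points get similarity near 1, far points near 0, so W already encodes the "lowest cut cost" structure the slides refer to.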
5. Concept Introduction
   - Object representation: a node can be a 2D point, a 3D point, a pixel, or even a whole image
   - It is all nodes and edges
6. Concept Introduction: semi-supervised vs. unsupervised learning
   - Unsupervised learning: no labeled data
7. Concept Introduction: semi-supervised vs. unsupervised learning
   - Semi-supervised learning: labeled data plus the structure of the unlabeled data
8. Graph Cut: Least-Squares Solution
   - Semi-supervised learning (labeled data)
   - We now have 3 objects; this should be a fully connected graph
9. Graph Cut: Least-Squares Solution
   - Objective: separate the graph into two parts (red and non-red)
   - The size of this matrix is N^2, and it is not sparse
10. Graph Cut: Least-Squares Solution
    - After that, we can divide the rest of the graph into blue and non-blue, and so on
    - Is this an NP-hard problem?
11. Graph Cut: Least-Squares Solution
    - Current situation: a fully connected graph represented by an NxN similarity matrix W
    - We expect each object to be assigned a label in {1, -1} ({red, non-red}) with the lowest assignment cost
    - But this is NP-hard
12. Label Propagation: Least-Squares Solution
    - Weighted-average concept: a new node is connected to a red node (label 1, weight 1), a blue node (label -1, weight 0.1), and a green node (label -1, weight 0.2)
    - 1*1 - 1*0.1 - 1*0.2 = 0.7, so it is probably a red object (label 1)
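The weighted-average arithmetic on this slide can be checked in a few lines; the labels and edge weights come straight from the slide's example:

```python
# Neighbors of the new node, taken from the slide's example:
# (label, edge weight) for the red, blue, and green nodes.
neighbors = [(+1, 1.0), (-1, 0.1), (-1, 0.2)]

# Weighted sum of neighbor labels: 1*1 - 1*0.1 - 1*0.2 = 0.7.
score = sum(label * weight for label, weight in neighbors)

# A positive score means the new node is probably red (label +1).
prediction = +1 if score > 0 else -1
```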
13. Label Propagation: Least-Squares Solution
    - Here comes the first equation. Let's define:
    - Matrix W (NxN): similarity between objects
    - Matrix D (NxN): diagonal matrix holding the degree of each object
    - Laplacian matrix L = D - W
    - Label vector f (Nx1): assignment of each object, relaxed to [-1, 1] rather than {-1, 1}
    - Objective function: minimize (1/2) * sum_ij w_ij (f_i - f_j)^2
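Expanding the objective shows why the Laplacian L = D - W appears: the smoothness cost is exactly a quadratic form in f.

```latex
\frac{1}{2}\sum_{i,j} w_{ij}\,(f_i - f_j)^2
  = \frac{1}{2}\sum_{i,j} w_{ij}\,(f_i^2 - 2 f_i f_j + f_j^2)
  = \sum_i d_{ii}\, f_i^2 - \sum_{i,j} w_{ij}\, f_i f_j
  = f^\top D f - f^\top W f
  = f^\top L f,
\qquad d_{ii} = \sum_j w_{ij}.
```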
14. Least-Squares Solution
    - Objective function: minimize (1/2) * sum_ij w_ij (f_i - f_j)^2
    - But this doesn't consider the labeled data yet
    - After some equation manipulation, we obtain a closed-form solution
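One common way the "equation manipulation" works out (this exact penalty form is an assumption; the slides do not show it) is to add a quadratic penalty tying labeled objects to their labels, so the minimizer of f'Lf + lam * sum over labeled i of (f_i - y_i)^2 comes from a single linear solve:

```python
import numpy as np

# Toy graph: 5 objects on a line, Gaussian similarities (hypothetical values).
X = np.array([[0.0], [0.2], [0.4], [2.0], [2.2]])
W = np.exp(-((X - X.T) ** 2) / 0.5)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W

# Labeled data: object 0 is red (+1), object 4 is non-red (-1).
y = np.array([1.0, 0.0, 0.0, 0.0, -1.0])
Lam = np.diag([1.0, 0.0, 0.0, 0.0, 1.0])   # indicator of labeled objects
lam = 10.0                                  # label-fidelity weight (assumed)

# Closed-form minimiser of f^T L f + lam * sum_labeled (f_i - y_i)^2:
# (L + lam*Lam) f = lam*Lam y.
f = np.linalg.solve(L + lam * Lam, lam * Lam @ y)
labels = np.sign(f)
```

The solve is over the full NxN system, which is exactly the cost the next slide complains about.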
15. Least-Squares Solution
    - We need to solve an NxN system: an NxN matrix inverse and NxN matrix multiplication
    - We need to reduce the dimensionality by using eigenvectors of the graph Laplacian
16. Eigenvectors
    - As mentioned before, we want to obtain a label vector f
    - f = U alpha, so once we have U we can get alpha and then we get f
    - Laplacian eigenmap dimensionality reduction: L captures the structure of the graph, mapping the objects into a new space
17. Eigenvectors
    - As mentioned before, we want to obtain a label vector f
    - Get the eigenvectors U of the Laplacian matrix L
    - f = U alpha, so once we have U we can get alpha and then we get f
    - We still have to work with an NxN matrix, at least to compute its eigenvectors
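A sketch of the reduction: with U holding the k smallest eigenvectors of L, restricting f = U alpha shrinks the N-dimensional problem to a k-dimensional one in alpha (the toy graph and penalty weight below are assumptions carried over from the least-squares example, not from the slides):

```python
import numpy as np

# Same hypothetical 5-object graph as in the least-squares example.
X = np.array([[0.0], [0.2], [0.4], [2.0], [2.2]])
W = np.exp(-((X - X.T) ** 2) / 0.5)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W

# k smallest eigenvectors of L (eigh returns eigenvalues in ascending order).
k = 2
vals, vecs = np.linalg.eigh(L)
U, Sigma = vecs[:, :k], np.diag(vals[:k])

# Restrict f = U @ alpha; the penalised objective becomes
# alpha^T Sigma alpha + lam * (U alpha - y)^T Lam (U alpha - y),
# a k x k linear system instead of an N x N one.
y = np.array([1.0, 0.0, 0.0, 0.0, -1.0])
Lam = np.diag([1.0, 0.0, 0.0, 0.0, 1.0])
lam = 10.0
alpha = np.linalg.solve(Sigma + lam * U.T @ Lam @ U,
                        lam * U.T @ Lam @ y)
f = U @ alpha
```

Only the final solve is cheap (k x k); computing U still requires an eigendecomposition of the NxN matrix L, which is the remaining bottleneck the eigenfunction slides address.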
18. Eigenfunctions
    - Eigenfunctions are the limit of eigenvectors as n goes to infinity
    - For each dimension, we calculate the eigenvector by interpolating the eigenfunction obtained from the histogram of that dimension
    - This takes far less time; needs more explanation
19. Eigenfunctions
    - Eigenfunctions are the limit of eigenvectors as n goes to infinity
    - Note that the cost of the eigenfunction solution scales with the number of dimensions, while the eigenvector solution scales with the number of objects
    - Example: image pixels as objects; images with local features as dimensions
20. Applications
    - Object classification
    - Interactive image segmentation
    - Image segmentation
21. Application: Object Classification
    - COIL-20 dataset: 20 different objects, each with 72 different poses
22. Application: Object Classification
    - Our experiment: label some of these images, with both positive and negative labels
    - Use the least-squares, eigenvector, and eigenfunction methods to compute the labels of the unlabeled data
23. Application: Object Classification
    - Our results: LSQ solution, eigenvector solution, eigenfunction solution
24. Application: Object Classification
    - Results analysis: the LSQ solution is almost perfect since it is an almost exact solution
    - The eigenvector method generates an approximate solution in less time, which makes sense: it only solves one NxN eigenvector problem
    - The eigenfunction method also generated an approximate solution, but its running time was worse here
25. Application: Object Classification
    - Timing results analysis
26. Application: Object Classification
    - Results explanation: we have 4 objects x 36 poses per object, 144 objects in total, so the Laplacian matrix is 144x144
    - Each image has 128x128 grayscale pixels, a total of 16,384, so each object has 16,384 dimensions
    - 144 objects vs. 16,384 dimensions
27. Application: Object Classification
    - Results explanation: 144 objects vs. 16,384 dimensions
    - So it is expected that the LSQ and eigenvector methods finish faster, since the matrix L is not that big
    - The eigenfunction method takes a long time because it computes an eigenfunction for each of the 16,384 dimensions
28. Application: Interactive Image Segmentation
    - Why is it called interactive?
29. Application: Interactive Image Segmentation
    - We now have 500x320 = 160,000 objects
    - But each object has only about 5 dimensions (R, G, B, X, Y)
    - Eigenvectors vs. eigenfunctions?
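Building the 5-dimensional (R, G, B, X, Y) representation is straightforward. A numpy sketch, assuming a 320x500 image with random values standing in for real pixels, and coordinates normalised to [0, 1] (the normalisation is an assumption, not stated on the slide):

```python
import numpy as np

# Hypothetical 320x500 RGB image; random values stand in for real pixels.
H, Wd = 320, 500
img = np.random.default_rng(0).random((H, Wd, 3))

# One feature vector (R, G, B, X, Y) per pixel: 160,000 objects, 5 dims.
ys, xs = np.mgrid[0:H, 0:Wd]
features = np.column_stack([
    img.reshape(-1, 3),            # colour channels in [0, 1]
    xs.reshape(-1) / (Wd - 1),     # normalised x coordinate
    ys.reshape(-1) / (H - 1),      # normalised y coordinate
])
```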
30. Application: Interactive Image Segmentation
    - Eigenvectors vs. eigenfunctions?
31. Application: Interactive Image Segmentation
    - There is no way LSQ or the eigenvector method can support this number of objects: the Laplacian matrix would be 160,000 x 160,000
    - The eigenfunction method only calculates eigenvectors over 5 dimensions, and we are ready to show some results
32. Application: Interactive Image Segmentation
33. Application: Interactive Image Segmentation
    - Notice how the bears have colors distinct from the rest of the image; this is still work in progress and not perfect yet
34. Applications
    - Non-interactive segmentation: foreground/background segmentation, co-segmentation
    - The system helps the user know where to add annotations
35. Conclusion
    - It is better to use eigenvectors when you have a small set of objects with high dimensionality
    - It is better to use eigenfunctions when you have a large set of objects with low dimensionality
36. Thanks