
# Hashing with Graphs (Anchor Graph Hashing)



### Slides

1. Hashing with Graphs (Anchor Graph Hashing), ICML 2011. 2011/8/4. blog.beam2d.net, @beam2d
2. Paper: Liu, W., Wang, J., Kumar, S. and Chang, S.-F. Hashing with Graphs. ICML 2011.
3. Hashing in R^d. (Slide text lost in extraction.)
4. Hashing maps a point x ∈ R^d to a binary code y ∈ {1, −1}^r (equivalently written with bits 0/1 instead of ±1).
5. Hashing in R^d. (Slide text lost in extraction; mentions the code length r.)
6. Hashing. (Slide text lost in extraction.)
7. Given points (x_i)_{i=1,…,n}, find codes Y = (Y_ik), i = 1,…,n, k = 1,…,r, by spectral relaxation:

       min_Y  (1/2) Σ_{i,j=1}^n ‖Y_i − Y_j‖² A_ij                  (1)
       s.t.   Y ∈ R^{n×r},  1ᵀY = 0,  YᵀY = n I_{r×r},             (2), (3)

   where A_ij = exp(−‖x_i − x_j‖²/t) is the affinity and Y_i denotes the i-th row of Y. (Constraint (2) balances each bit; (3) decorrelates the bits.)
8. For a graph G with adjacency matrix A, degree matrix D = diag(A1), and graph Laplacian L = D − A,

       (1/2) Σ_{i,j=1}^n ‖Y_i − Y_j‖² A_ij = tr(YᵀLY).
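
The Laplacian identity above can be checked numerically; this is a minimal numpy sketch where the points, the bandwidth t, and the matrix Y are arbitrary test values, not quantities from the paper:

```python
import numpy as np

# Numerically check (1/2) * sum_ij ||Y_i - Y_j||^2 * A_ij == tr(Y^T L Y)
# on a toy example; X, Y, and the bandwidth t are arbitrary stand-ins.
rng = np.random.default_rng(0)
n, r, t = 6, 2, 1.0

X = rng.normal(size=(n, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-sq / t)              # affinity A_ij = exp(-||x_i - x_j||^2 / t)

D = np.diag(A.sum(axis=1))       # degree matrix D = diag(A 1)
L = D - A                        # graph Laplacian L = D - A

Y = rng.normal(size=(n, r))
lhs = 0.5 * sum(A[i, j] * np.sum((Y[i] - Y[j]) ** 2)
                for i in range(n) for j in range(n))
rhs = np.trace(Y.T @ L @ Y)
assert np.isclose(lhs, rhs)
```

The identity holds for any symmetric A, which is why the toy affinity needs no special structure.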
9. Relaxed problem:

       min_Y  tr(YᵀLY)   s.t.  Y ∈ R^{n×r},  1ᵀY = 0,  YᵀY = n I_{r×r}.     (2), (3)

   Under (3) alone, the minimizer is given by r eigenvectors of L (cf. …); the trivial eigenvector 1 has eigenvalue 0 but violates (2), so the eigenvectors from the 2nd-smallest eigenvalue onward are used.
10. Building the full affinity A (and Laplacian L) over all n points is too costly, so m (≪ n) anchor points u_1, …, u_m ∈ R^d are introduced; the experiments use n = 69,000 and m = 300, and each point is linked to its s nearest anchors (here s = 2).
11. Define a sparse matrix Z ∈ R^{n×m} (about a fraction s/m of its entries are nonzero):

        Z_ij = h(x_i, u_j) / Σ_{j′ ∈ ⟨i⟩} h(x_i, u_{j′})   if j ∈ ⟨i⟩,
        Z_ij = 0   otherwise,

    where ⟨i⟩ is the set of the s anchors nearest to x_i and h is a kernel function.
12. With Λ = diag(1ᵀZ) ∈ R^{m×m}, the adjacency is approximated by Â = ZΛ⁻¹Zᵀ.
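
A toy sketch of this anchor-graph construction, assuming a Gaussian kernel for h and made-up data and anchors (the real method would pick anchors from the data, e.g. by clustering):

```python
import numpy as np

# Toy sketch of the anchor graph: Z keeps each point's s nearest
# anchors (a Gaussian kernel is assumed for h), then A_hat = Z Λ^{-1} Z^T.
rng = np.random.default_rng(0)
n, m, s, d, t = 20, 4, 2, 3, 1.0

X = rng.normal(size=(n, d))                  # data points
U = rng.normal(size=(m, d))                  # anchors u_1..u_m (m << n)

sq = ((X[:, None, :] - U[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / t)                          # kernel values h(x_i, u_j)

Z = np.zeros((n, m))                         # sparse, row-stochastic
for i in range(n):
    nearest = np.argsort(-K[i])[:s]          # the s anchors nearest x_i
    Z[i, nearest] = K[i, nearest] / K[i, nearest].sum()

Lam = np.diag(Z.sum(axis=0))                 # Λ = diag(1^T Z)
# pseudo-inverse in case some anchor goes unused in this tiny sample
A_hat = Z @ np.linalg.pinv(Lam) @ Z.T        # Â = Z Λ^{-1} Z^T

assert np.allclose(A_hat.sum(axis=1), 1.0)   # rows (and columns) sum to 1
assert np.linalg.matrix_rank(A_hat) <= m     # rank at most m
```

The two assertions check the properties that slide 13 relies on: Â is doubly stochastic and has rank at most m.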
13. Properties of Â: its entries are nonnegative (≥ 0), its rank is at most m, and its rows and columns each sum to 1, so the Laplacian simplifies to L = I − Â.
14. Write Â = ZΛ^{−1/2} Λ^{−1/2} Zᵀ and let M = Λ^{−1/2} Zᵀ Z Λ^{−1/2} ∈ R^{m×m}. Taking the SVD ZΛ^{−1/2} = UΣ^{1/2}Vᵀ (U ∈ R^{n×m}, Σ ∈ R^{m×m}, V ∈ R^{m×m}) gives

        Â = UΣ^{1/2}VᵀVΣ^{1/2}Uᵀ = UΣUᵀ,
        M = VΣ^{1/2}UᵀUΣ^{1/2}Vᵀ = VΣVᵀ,

    and U = ZΛ^{−1/2}VΣ^{−1/2}; the top r columns of U yield Y.
15. Let the eigenvalues of Σ, excluding the trivial value 1, be σ_1, …, σ_r, …, with corresponding columns v_1, …, v_r ∈ R^m of V. With Σ_r = diag(σ_1, …, σ_r) and V_r = [v_1, …, v_r], define

        W = √n Λ^{−1/2} V_r Σ_r^{−1/2} ∈ R^{m×r},

    and the embedding is obtained as Y = ZW.
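
A minimal sketch of this small m × m eigenproblem; a dense row-stochastic Z stands in for the real anchor-graph matrix, and the assertions verify that Y = ZW satisfies constraints (2) and (3):

```python
import numpy as np

# Sketch of the small m x m eigenproblem giving Y = Z W; a dense
# row-stochastic Z stands in for the real anchor-graph matrix.
rng = np.random.default_rng(0)
n, m, r = 40, 6, 3

Z = rng.random((n, m))
Z /= Z.sum(axis=1, keepdims=True)            # rows sum to 1

Lam_isqrt = np.diag(Z.sum(axis=0) ** -0.5)   # Λ^{-1/2}
M = Lam_isqrt @ Z.T @ Z @ Lam_isqrt          # shares Â's nonzero spectrum

sigma, V = np.linalg.eigh(M)                 # eigh returns ascending order
order = np.argsort(-sigma)                   # re-sort descending
sigma, V = sigma[order], V[:, order]

# Skip the trivial eigenvalue σ = 1, keep the next r eigenpairs.
V_r = V[:, 1:r + 1]
Sig_r_isqrt = np.diag(sigma[1:r + 1] ** -0.5)

W = np.sqrt(n) * Lam_isqrt @ V_r @ Sig_r_isqrt  # W = √n Λ^{-1/2} V_r Σ_r^{-1/2}
Y = Z @ W                                       # spectral embedding

assert np.allclose(Y.T @ Y, n * np.eye(r))      # constraint (3): Y^T Y = n I_r
assert np.allclose(Y.sum(axis=0), 0.0)          # constraint (2): 1^T Y = 0
```

The √n factor is exactly what makes YᵀY come out as nI rather than I (see the Editor's Notes).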
16. Nyström method: as n → ∞, the k-th eigenvector converges to an eigenfunction φ_k, estimated from n samples as

        φ_{n,k}(x) = (1/σ_k) Σ_{i=1}^n Â(x, x_i) Y_ik,

    where Â(·, ·) extends the anchor-graph affinity to an arbitrary point x.
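
One consequence worth checking: evaluated at a training point, the Nyström extension reproduces the training embedding, because Y's columns are (scaled) eigenvectors of Â. A sketch with a toy row-stochastic Z:

```python
import numpy as np

# Sketch: at a training point the Nyström extension
#   φ_k(x_i) = (1/σ_k) Σ_j Â(x_i, x_j) Y_jk
# gives back Y_ik, since Y's columns are eigenvectors of Â.
rng = np.random.default_rng(0)
n, m, r = 30, 5, 2

Z = rng.random((n, m))
Z /= Z.sum(axis=1, keepdims=True)            # toy row-stochastic Z
Lam_isqrt = np.diag(Z.sum(axis=0) ** -0.5)   # Λ^{-1/2}

M = Lam_isqrt @ Z.T @ Z @ Lam_isqrt
sigma, V = np.linalg.eigh(M)
order = np.argsort(-sigma)
sigma, V = sigma[order], V[:, order]

# Y = Z W with the trivial eigenvalue σ = 1 skipped, as on slide 15.
W = np.sqrt(n) * Lam_isqrt @ V[:, 1:r + 1] @ np.diag(sigma[1:r + 1] ** -0.5)
Y = Z @ W

A_hat = Z @ np.diag(Z.sum(axis=0) ** -1.0) @ Z.T   # Â = Z Λ^{-1} Z^T
Phi = (A_hat @ Y) / sigma[1:r + 1]                 # divide column k by σ_k
assert np.allclose(Phi, Y)                         # extension matches Y
```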
17. Applying the Nyström formula to AGH, φ_{n,k}(x) = (1/σ_k) Σ_{i=1}^n Â(x, x_i) Y_ik simplifies to

        φ_{n,k}(x) = w_kᵀ z(x),

    where z maps x to its (sparse) anchor-kernel vector; evaluating it costs O(dm).
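
The O(dm) out-of-sample hash can be sketched as follows; the anchors U, projection W, and parameters s and t are toy stand-ins for the quantities learned on the training set, and a Gaussian kernel is assumed:

```python
import numpy as np

# Sketch of the O(dm) out-of-sample hash: a new x touches only the
# m anchors; U, W, s, t are toy stand-ins for the learned quantities.
rng = np.random.default_rng(1)
d, m, r, s, t = 5, 8, 4, 2, 1.0

U = rng.normal(size=(m, d))          # anchors
W = rng.normal(size=(m, r))          # projection from the training stage

def z(x):
    """Sparse anchor-kernel vector z(x) with s nonzeros out of m."""
    k = np.exp(-((U - x) ** 2).sum(axis=1) / t)   # O(dm) kernel evals
    nearest = np.argsort(-k)[:s]
    out = np.zeros(m)
    out[nearest] = k[nearest] / k[nearest].sum()
    return out

def hash_code(x):
    """r-bit code: y_k = sgn(w_k^T z(x))."""
    return np.sign(W.T @ z(x))

code = hash_code(rng.normal(size=d))
assert code.shape == (r,)
assert set(np.unique(code)) <= {-1.0, 1.0}
```

Since z(x) has only s nonzeros, the final projection is even cheaper than the O(dm) kernel evaluations.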
18. Experiments on MNIST*: 28 × 28 = 784-pixel digit images (784-dimensional vectors), n = 69,000 training points, 1,000 query points. * http://yann.lecun.com/exdb/mnist/
19. Results with m = 300, s = 2. (Figures lost in extraction.)
20. (Slide text lost in extraction.)
21. References:
    - A. Andoni and P. Indyk. Near-optimal hashing algorithms for approximate nearest neighbor in high dimensions. Proceedings of FOCS, 2006.
    - Y. Bengio, O. Delalleau, N. Le Roux, and J.-F. Paiement. Learning eigenfunctions links spectral embedding and kernel PCA. Neural Computation, 2004.
    - A. Gionis, P. Indyk, and R. Motwani. Similarity search in high dimensions via hashing. Proceedings of VLDB, 1999.
    - P. Indyk and R. Motwani. Approximate nearest neighbor: Towards removing the curse of dimensionality. Proceedings of STOC, 1998.
    - B. Kulis and T. Darrell. Learning to hash with binary reconstructive embeddings. NIPS 22, 2010.
    - B. Kulis and K. Grauman. Kernelized locality-sensitive hashing for scalable image search. Proceedings of ICCV, 2009.
22. References (continued):
    - W. Liu, J. He, and S.-F. Chang. Large graph construction for scalable semi-supervised learning. Proceedings of ICML, 2010.
    - W. Liu, J. Wang, S. Kumar, and S.-F. Chang. Hashing with graphs. Proceedings of ICML, 2011.
    - M. Raginsky and S. Lazebnik. Locality-sensitive binary codes from shift-invariant kernels. NIPS 22, 2010.
    - J. Wang, S. Kumar, and S.-F. Chang. Sequential projection learning for hashing with compact codes. Proceedings of ICML, 2010.
    - Y. Weiss, A. Torralba, and R. Fergus. Spectral hashing. NIPS 21, 2009.
    - C. Williams and M. Seeger. The effect of the input density distribution on kernel-based classifiers. Proceedings of ICML, 2000.

### Editor's Notes

• SH has the same problem
• Something like a discrete Laplace operator acting on functions defined on the nodes of G
• 1 does not satisfy the balance condition
• Heat kernel
• Moving through this part quickly
• The √n is there because of YᵀY = nI
• For the d in O(dm), a sparse vector only needs its number of nonzero components
• [Liu+, 11] also reports experiments on another dataset
• MAP: Mean Average Precision. Scanning the ranking from the top, if the k-th correctly labeled item sits at rank n, that item scores k/n; MAP is the average of these scores