Hashing with Graphs
Anchor Graph Hashing
ICML 2011 paper reading, August 4, 2011

blog.beam2d.net @beam2d
Hashing with Graphs.
Liu, W., Wang, J., Kumar, S. and Chang, S.-F. ICML 2011.

                                                           2
Hashing
‣ Setting: data points living in R^d
                                                             3
Hashing
‣ Map each data point x ∈ R^d to an r-bit binary code y ∈ {1, -1}^r
  (whether the bits are written as 0/1 or as ±1 makes no difference)
                                                             4
Hashing
                                                             5
Hashing = (binarization) ∘ (embedding)
‣ Learn a real-valued embedding of the data, then binarize it into codes
‣ Cost: both steps have to scale well with the number of data points n
                                                             6
Given data points (x_i)_{i=1,...,n}, find codes Y = (Y_ik)_{i=1,...,n, k=1,...,r} such that
‣ (1) nearby points (large affinity A_ij) receive similar codes, with
          A_ij = exp(−‖x_i − x_j‖² / t)
‣ (2) each bit is balanced over the data set
‣ (3) the r bits are mutually uncorrelated

        min_Y  ½ Σ_{i,j=1}^n ‖Y_i − Y_j‖² A_ij          (Y_i: the i-th row of Y)        (1)

        s.t.   Y ∈ R^{n×r},   1ᵀY = 0,   YᵀY = n I_{r×r}.
                              (2)        (3)
        (1: the vector whose components are all 1)
                                                             7
Let A be the affinity matrix of the graph G, D = diag(A1) the degree matrix,
and L = D − A the graph Laplacian of G. Then

        ½ Σ_{i,j=1}^n ‖Y_i − Y_j‖² A_ij = tr(Yᵀ L Y).

(A numerical check of this identity is sketched below.)
                                                             8
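To see the identity in numbers, here is a small numpy sketch of my own (not from the slides): it builds the heat-kernel affinity A on random data, forms D and L, and compares the pairwise sum with tr(YᵀLY) for an arbitrary Y.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r, t = 50, 5, 4, 1.0

X = rng.normal(size=(n, d))      # data points x_1, ..., x_n
Y = rng.normal(size=(n, r))      # an arbitrary real-valued embedding Y

# Affinity A_ij = exp(-||x_i - x_j||^2 / t)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-sq_dists / t)

D = np.diag(A.sum(axis=1))       # D = diag(A 1)
L = D - A                        # graph Laplacian L = D - A

lhs = 0.5 * sum(A[i, j] * np.sum((Y[i] - Y[j]) ** 2)
                for i in range(n) for j in range(n))
rhs = np.trace(Y.T @ L @ Y)
print(np.isclose(lhs, rhs))      # True: the two expressions agree
```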
Rewriting with the Laplacian L, the problem becomes

        min_Y  tr(Yᵀ L Y)

        s.t.   Y ∈ R^{n×r},   1ᵀY = 0,   YᵀY = n I_{r×r}.
                              (2)        (3)

‣ Under constraint (3) alone, the minimum is attained by the r eigenvectors of L with
  the smallest eigenvalues
‣ The all-ones vector 1 is such an eigenvector (eigenvalue 0) but violates the balance
  constraint (2)
‣ So take the eigenvectors from the 2nd smallest eigenvalue onward; they are orthogonal
  to 1 and hence satisfy (2)
                                                             9
‣ A (and hence L) is an n×n matrix: too large to build directly
‣ Idea: approximate A with an anchor graph
‣ Pick m (≪ n) anchor points u_1, …, u_m ∈ R^d
    - e.g. as cluster centers of the data (k-means sketch below)
    - in the experiments: n = 69,000, m = 300
‣ Each data point is linked only to its s nearest anchors (e.g. s = 2)
                                                             10
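A minimal sketch of the anchor-selection step, assuming the anchors are taken as k-means centroids; scikit-learn's KMeans and the random data here are stand-ins of my own, not something specified on the slide.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n, d, m = 2000, 10, 50                       # m anchors with m << n

X = rng.normal(size=(n, d))                  # stand-in for the data set (x_i)

# Anchors u_1, ..., u_m chosen as k-means centroids of the data.
# (MiniBatchKMeans is a cheaper drop-in when n is large.)
anchors = KMeans(n_clusters=m, n_init=4, random_state=0).fit(X).cluster_centers_
print(anchors.shape)                         # (50, 10): m points in R^d
```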
‣ Build the data-to-anchor affinity matrix Z ∈ R^{n×m}
    - each row of Z has only s nonzero entries (density s/m), so Z is sparse
    - the entries are defined through a kernel h:

        Z_ij =  h(x_i, u_j) / Σ_{j'∈⟨i⟩} h(x_i, u_{j'}),   if j ∈ ⟨i⟩,
                0,                                          otherwise.

  (⟨i⟩: the index set of the s anchors nearest to x_i;  h: a kernel, e.g. the heat kernel)
  (a sparse construction of Z is sketched below)
                                                             11
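The sparse construction of Z might look as follows, assuming h is the heat kernel exp(−‖x − u‖²/t); the helper name anchor_affinity and the bandwidth t are illustrative choices, not from the slides.

```python
import numpy as np
import scipy.sparse as sp

def anchor_affinity(X, anchors, s=2, t=1.0):
    """Sparse Z in R^{n x m}: each row keeps the s nearest anchors,
    weighted by the heat kernel h and normalized to sum to 1."""
    n, m = X.shape[0], anchors.shape[0]
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)   # (n, m) squared distances
    nearest = np.argsort(d2, axis=1)[:, :s]                     # index set <i> for each point
    rows = np.repeat(np.arange(n), s)
    cols = nearest.ravel()
    vals = np.exp(-d2[rows, cols] / t)                          # h(x_i, u_j) for the kept pairs
    Z = sp.csr_matrix((vals, (rows, cols)), shape=(n, m))
    row_sums = np.asarray(Z.sum(axis=1)).ravel()
    return sp.diags(1.0 / row_sums) @ Z                         # normalize each row to sum to 1

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
anchors = X[rng.choice(200, size=20, replace=False)]            # stand-in anchors
Z = anchor_affinity(X, anchors, s=2, t=1.0)
print(Z.shape, Z.nnz)                          # (200, 20) with s nonzeros per row (density s/m)
print(np.allclose(Z.sum(axis=1), 1.0))         # rows of Z sum to 1
```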
‣ From Z, define an approximate affinity matrix between the data points
‣ With Λ = diag(1ᵀZ) ∈ R^{m×m},

        Â = Z Λ⁻¹ Zᵀ.

  (sketch below)
                                                             12
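A small sanity check of Â (my own sketch): it densifies Â for a tiny random stand-in of Z, which one would never do at scale, and verifies the symmetry and unit row sums that the next slide relies on.

```python
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
n, m, s = 200, 20, 2

# Random sparse stand-in for Z with s nonzeros per row and unit row sums,
# playing the role of the anchor matrix built on the previous slide.
cols = np.array([rng.choice(m, size=s, replace=False) for _ in range(n)]).ravel()
rows = np.repeat(np.arange(n), s)
Z = sp.csr_matrix((rng.random(n * s), (rows, cols)), shape=(n, m))
Z = sp.diags(1.0 / np.asarray(Z.sum(axis=1)).ravel()) @ Z

Lam = np.asarray(Z.sum(axis=0)).ravel()              # Λ = diag(1ᵀZ): column sums of Z
A_hat = (Z @ sp.diags(1.0 / Lam) @ Z.T).toarray()    # Â = Z Λ⁻¹ Zᵀ (densified only to inspect it)

print(np.allclose(A_hat, A_hat.T))                   # Â is symmetric
print(np.allclose(A_hat.sum(axis=1), 1.0))           # every row of Â sums to 1, so L = I − Â
```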
‣ Properties of Â
    - all eigenvalues are ≥ 0 (Â is positive semidefinite)
    - rank at most m
    - every row sums to 1, so the graph Laplacian of the anchor graph is simply
          L = I − Â
      and the smallest eigenvectors of L are the largest eigenvectors of Â
‣ It therefore suffices to find the top eigenvectors of Â
    - but Â is still n×n, so we never form it explicitly
                                                             13
‣ Write Â = Z Λ^{−1/2} Λ^{−1/2} Zᵀ
‣ Define the small matrix  M = Λ^{−1/2} Zᵀ Z Λ^{−1/2} ∈ R^{m×m}
‣ Let Z Λ^{−1/2} = U Σ^{1/2} Vᵀ be an SVD
  (U ∈ R^{n×m}, Σ ∈ R^{m×m} diagonal, V ∈ R^{m×m})
‣ Then

        Â = U Σ^{1/2} Vᵀ V Σ^{1/2} Uᵀ = U Σ Uᵀ,
        M = V Σ^{1/2} Uᵀ U Σ^{1/2} Vᵀ = V Σ Vᵀ.

‣ So the eigendecomposition of the m×m matrix M yields the eigenvectors of Â via

        U = Z Λ^{−1/2} V Σ^{−1/2}

  (sketch below)
                                                             14
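The reduction to the m×m problem can be sketched as follows (my own sketch, with a dense random stand-in for Z): it eigendecomposes M, recovers U, and checks Â = UΣUᵀ and the orthonormality of U's columns.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 20

# Dense random stand-in for Z with unit row sums (in practice Z is sparse).
Z = rng.random((n, m))
Z /= Z.sum(axis=1, keepdims=True)

Lam = Z.sum(axis=0)                                   # diagonal of Λ
ZL = Z / np.sqrt(Lam)                                 # Z Λ^{-1/2}
M = ZL.T @ ZL                                         # M = Λ^{-1/2} Zᵀ Z Λ^{-1/2}, only m x m

sigma, V = np.linalg.eigh(M)                          # M = V Σ Vᵀ (eigh returns ascending order)
sigma, V = sigma[::-1], V[:, ::-1]                    # reorder to descending eigenvalues

U = ZL @ V / np.sqrt(sigma)                           # U = Z Λ^{-1/2} V Σ^{-1/2}

A_hat = Z @ np.diag(1.0 / Lam) @ Z.T                  # Â = Z Λ⁻¹ Zᵀ
print(np.allclose(A_hat, U @ np.diag(sigma) @ U.T))   # Â = U Σ Uᵀ
print(np.allclose(U.T @ U, np.eye(m)))                # the columns of U are orthonormal
```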
‣ Y is obtained from r columns of U, suitably scaled
‣ The eigenvalues in Σ are 1, σ_1, …, σ_r, … in decreasing order; skip the trivial
  eigenvalue 1 and take the eigenvectors v_1, …, v_r ∈ R^m of M for σ_1, …, σ_r
‣ Let Σ_r = diag(σ_1, …, σ_r) and V_r = [v_1, …, v_r]
‣ Define the projection W as

        W = √n Λ^{−1/2} V_r Σ_r^{−1/2} ∈ R^{m×r}

‣ Then the embedding is simply

        Y = Z W.

  (sketch below)
                                                             15
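Continuing the same sketch, the code below forms W, computes Y = ZW, and checks the balance and uncorrelatedness constraints; thresholding Y into actual bits is only hinted at as a comment, since this slide stops at the real-valued Y.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 200, 20, 8

Z = rng.random((n, m))                       # dense stand-in for the sparse anchor matrix
Z /= Z.sum(axis=1, keepdims=True)            # rows of Z sum to 1

Lam = Z.sum(axis=0)                          # diagonal of Λ = diag(1ᵀZ)
M = (Z / np.sqrt(Lam)).T @ (Z / np.sqrt(Lam))

sigma, V = np.linalg.eigh(M)                 # ascending eigenvalues
sigma, V = sigma[::-1], V[:, ::-1]           # descending: sigma[0] is the trivial eigenvalue 1

Sig_r, V_r = sigma[1:r + 1], V[:, 1:r + 1]   # skip the trivial pair, keep the next r

# W = sqrt(n) Λ^{-1/2} V_r Σ_r^{-1/2}, then Y = Z W
W = np.sqrt(n) * (V_r / np.sqrt(Lam)[:, None]) / np.sqrt(Sig_r)
Y = Z @ W

print(np.allclose(Y.sum(axis=0), 0.0, atol=1e-6))   # balance:      1ᵀY = 0
print(np.allclose(Y.T @ Y, n * np.eye(r)))          # uncorrelated: YᵀY = n I
# Binary codes would then be np.sign(Y) (thresholding at zero).
```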
Nyström method

‣ How do we compute the code of a point that was not in the training set?
‣ As n → ∞, the sample eigenvectors converge to eigenfunctions
‣ With n samples, the k-th eigenfunction is approximated by

        φ_{n,k}(x) = (1/σ_k) Σ_{i=1}^n Â(x, x_i) Y_ik.

  (Â(x, x_i): the affinity between a new point x and the training point x_i)
                                                             16
AGH and the Nyström method

        φ_{n,k}(x) = (1/σ_k) Σ_{i=1}^n Â(x, x_i) Y_ik.

‣ Evaluating this directly needs all n training points
‣ In AGH the extension takes the simple form

        φ_{n,k}(x) = w_kᵀ z(x)        (z: the anchor representation of x, i.e. its row of Z)

    - only the m anchors and W are needed at query time
    - cost per query: O(dm)
  (sketch below)
                                                             17
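A sketch of the query-time step under the same assumptions as before (heat kernel, s nearest anchors); hash_query and the random stand-ins for the trained anchors and W are illustrative, and the sign() threshold is the usual binarization rather than something written on this slide.

```python
import numpy as np

def hash_query(x, anchors, W, s=2, t=1.0):
    """Out-of-sample code for a single query x: sign(Wᵀ z(x)), where z(x) is
    x's s-nearest-anchor affinity vector, built exactly like a row of Z."""
    d2 = ((anchors - x) ** 2).sum(axis=1)      # squared distances to the m anchors: O(dm)
    nearest = np.argsort(d2)[:s]               # the s nearest anchors of x
    z = np.zeros(anchors.shape[0])
    z[nearest] = np.exp(-d2[nearest] / t)      # h(x, u_j) on those anchors
    z /= z.sum()                               # normalize, as in Z
    return np.sign(W.T @ z)                    # r-bit code (thresholding at zero)

# Toy usage with random stand-ins for the trained anchors and projection W.
rng = np.random.default_rng(0)
d, m, r = 5, 20, 8
anchors = rng.normal(size=(m, d))
W = rng.normal(size=(m, r))
print(hash_query(rng.normal(size=d), anchors, W))
```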
‣ Data set: MNIST* (handwritten digit images)
‣ Each image has 28 × 28 = 784 pixels
‣ Images are used directly as 784-dimensional vectors
‣ Database: n = 69,000 points
‣ Queries: 1,000 points

* http://yann.lecun.com/exdb/mnist/                          18
(Results figure:  m = 300, s = 2)
                                                             19
                                                             20
A. Andoni and P. Indyk. Near-optimal hashing algorithms for
approximate nearest neighbor in high dimensions. Proceedings of
FOCS, 2006.
Y. Bengio, O. Delalleau, N. Le Roux, and J.-F. Paiement. Learning
eigenfunctions links spectral embedding and kernel PCA. Neural
Computation, 2004.
A. Gionis, P. Indyk, and R. Motwani. Similarity search in high
dimensions via hashing. Proceedings of VLDB, 1999.
P. Indyk and R. Motwani. Approximate nearest neighbor: Towards
removing the curse of dimensionality. Proceedings of STOC, 1998.
B. Kulis and T. Darrell. Learning to hash with binary reconstructive
embeddings. NIPS 22, 2010.
B. Kulis and K. Grauman. Kernelized locality-sensitive hashing for
scalable image search. Proceedings of ICCV, 2009.                      21
W. Liu, J. He, and S.-F. Chang. Large graph construction for scalable
semi-supervised learning. Proceedings of ICML, 2010.
W. Liu, J. Wang, S. Kumar, and S.-F. Chang. Hashing with graphs.
ICML, 2011.
M. Raginsky and S. Lazebnik. Locality-sensitive binary codes from
shift-invariant kernels. NIPS 22, 2010.
J. Wang, S. Kumar, and S.-F. Chang. Sequential projection learning for
hashing with compact codes. Proceedings of ICML, 2010.
Y. Weiss, A. Torralba, and R. Fergus. Spectral hashing. NIPS 21, 2009.
C. Williams and M. Seeger. The effect of the input density distribution
on kernel-based classifiers. Proceedings of ICML, 2000.


                                                                          22


Editor's Notes

  2. Motivation: the need to handle huge amounts of data quickly and with little memory keeps growing; hashing matters as a practically oriented form of unsupervised learning.
  7. SH (spectral hashing) addresses the same problem.
  8. Something like a discrete Laplace operator acting on functions defined on the nodes of G.
  9. The all-ones vector 1 does not satisfy the balance condition.
  10.-13. Heat kernel.
  14. Skim through this part quickly.
  15.-16. The √n is there so that YᵀY = nI.
  17. The d in O(dm) can be replaced by the number of nonzero components when the input vectors are sparse.
  18. [Liu+, 11] also reports experiments on one more data set.
  19. MAP: mean average precision. Going down the ranking, if the k-th correctly labeled item appears at rank n, that item scores k/n; the scores are then averaged.