Building Compatible Bases on Graphs, Images, and Manifolds
 


Spectral methods are used in computer graphics, machine learning, and computer vision, where many important problems boil down to constructing a Laplacian operator and finding its eigenvalues and eigenfunctions. We show how to generalize spectral geometry to multiple data spaces. Our construction is based on the idea of simultaneous diagonalization of Laplacian operators. We describe this problem and discuss numerical methods for its solution. We provide several synthetic and real examples of manifold learning, object classification, and clustering, showing that the joint spectral geometry better captures the inherent structure of multi-modal data.

Talk at SIAM-IS 2014 (http://www.math.hkbu.edu.hk/SIAM-IS14/). Many thanks to Michael Bronstein for providing the great set of slides that this presentation extends.

    Presentation Transcript

    • Building Compatible Bases on Graphs, Images, and Manifolds. Davide Eynard, Institute of Computational Science, Faculty of Informatics, University of Lugano, Switzerland. SIAM-IS, 14 May 2014. Based on joint works with Artiom Kovnatsky, Michael M. Bronstein, Klaus Glashoff, and Alexander M. Bronstein. 1 / 85
    • Ambiguous data: Cayenne (city in Guiana, pepper, Porsche car) 5 / 85
    • Multimodal data analysis Cayenne, Porsche, car, automobile, SUV,... Chili, pepper, red, hot, food, plant, spice,... San Francisco, city, USA, California, hill,... Landrover, SUV, car, Jeep, 4x4, terrain,... Cayenne, city, Guiana, America, ocean,... Cayenne, pepper, hot, plant, spice, red,... 6 / 85
    • Multimodal data analysis: Chili, food / San Francisco, USA / Landrover, SUV / Cayenne, city / Cayenne, Porsche / Cayenne, pepper (image space vs. tag space) 9 / 85
    • Discrete manifolds: a graph (X, E) on a discrete set of n vertices X = {x1, . . . , xn}. Gaussian edge weight w_ij = exp(−‖x_i − x_j‖² / 2σ²) if (i, j) ∈ E, and 0 otherwise. Unnormalized Laplacian operator L = D − W, where D = diag(Σ_{j≠i} w_ij) (vertex weight). Symmetric normalized Laplacian L_sym = D^{−1/2} L D^{−1/2} 13 / 85
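The construction above can be sketched in NumPy. This is a minimal dense-graph illustration: the slides restrict the Gaussian weight to an edge set E, whereas here every pair of vertices is connected; the function name is ours, not from the talk.

```python
import numpy as np

def graph_laplacian(X, sigma=1.0, normalized=False):
    """Gaussian-weight graph Laplacian on a dense graph: L = D - W with
    w_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)); optionally the symmetric
    normalized version L_sym = D^{-1/2} L D^{-1/2}."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)                             # no self-loops
    deg = W.sum(axis=1)                                  # vertex weights
    L = np.diag(deg) - W
    if normalized:
        d_isqrt = np.diag(1.0 / np.sqrt(deg))
        L = d_isqrt @ L @ d_isqrt
    return L
```

Both variants are symmetric; the unnormalized Laplacian additionally annihilates constant functions (each row sums to zero).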
    • Laplacian eigenvalues and eigenfunctions. Eigenvalue problem: LΦ = ΦΛ, where Λ = diag(λ1, . . . , λn) are the eigenvalues, satisfying 0 = λ1 ≤ λ2 ≤ . . . ≤ λn, and Φ = (φ1, . . . , φn) are the orthonormal eigenfunctions 14 / 85
    • Spectral geometry. Laplacian eigenmap: m-dimensional embedding of X, U = argmin_{U ∈ R^{n×m}} tr(Uᵀ L U) s.t. Uᵀ U = I, whose solution is given by the first m eigenvectors of the Laplacian, U = (φ1, . . . , φm). Belkin, Niyogi 2001 16 / 85
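In matrix terms the variational problem reduces to an ordinary eigendecomposition; a minimal NumPy sketch (the helper name `laplacian_eigenmap` is ours):

```python
import numpy as np

def laplacian_eigenmap(L, m):
    """Embed the n vertices into R^m with the first m eigenvectors of a
    symmetric Laplacian L (eigh returns eigenvalues in ascending order;
    in practice the constant eigenvector phi_1 is often dropped)."""
    _, phi = np.linalg.eigh(L)
    return phi[:, :m]
```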
    • Heat equation. Heat diffusion on X is governed by the heat equation Lf(t) + ∂f(t)/∂t = 0, f(0) = u, where f(t) is the amount of heat at time t. The heat operator (or heat kernel) Hᵗ = e^{−tL} = Φ e^{−tΛ} Φᵀ provides the solution of the heat equation, f(t) = Hᵗ f(0): 'how much heat is transferred from point x_i to point x_j in time t' 20 / 85
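The heat operator can be assembled directly from the eigendecomposition; a small sketch (function name ours):

```python
import numpy as np

def heat_kernel(L, t):
    """H^t = Phi exp(-t Lambda) Phi^T, the solution operator of the
    heat equation: f(t) = H^t f(0)."""
    lam, phi = np.linalg.eigh(L)          # L = Phi Lambda Phi^T
    return phi @ np.diag(np.exp(-t * lam)) @ phi.T
```

(Hᵗ)_{ij} then measures how much heat flows from x_i to x_j in time t; for an unnormalized Laplacian, Hᵗ preserves constant functions since L·1 = 0.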
    • Spectral geometry. Diffusion map: m-dimensional embedding of X using the heat kernel, U = (e^{−tλ1} φ1, . . . , e^{−tλm} φm). Bérard et al. 1994; Coifman, Lafon 2006 21 / 85
    • Spectral geometry. Diffusion distance: crosstalk between heat kernels, d_t²(x_p, x_q) = Σ_{i=1}^n ((Hᵗ)_{pi} − (Hᵗ)_{qi})² = Σ_{i=1}^n e^{−2tλ_i} (φ_{pi} − φ_{qi})². Bérard et al. 1994; Coifman, Lafon 2006 23 / 85
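The second expression makes the diffusion distance cheap to evaluate once the spectrum is known; a sketch (function name ours):

```python
import numpy as np

def diffusion_distance(L, t, p, q):
    """d_t(x_p, x_q)^2 = sum_i exp(-2 t lambda_i) (phi_pi - phi_qi)^2,
    i.e. the Euclidean distance between rows p and q of the diffusion map."""
    lam, phi = np.linalg.eigh(L)
    diff = phi[p, :] - phi[q, :]
    return np.sqrt(np.sum(np.exp(-2.0 * t * lam) * diff ** 2))
```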
    • Spectral geometry. Diffusion distance: Euclidean distance in the diffusion map space, d_t(x_p, x_q) = ‖U_p − U_q‖₂. Bérard et al. 1994; Coifman, Lafon 2006 25 / 85
    • Spectral geometry. Spectral clustering: instead of applying K-means clustering in the original data space, apply it in the Laplacian eigenspace. Ng et al. 2001 27 / 85
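A compact sketch of this pipeline in the spirit of Ng et al. 2001 (names ours; the row normalization and the deterministic K-means initialization are simplifications, and we assume the embedding has at least k distinct rows):

```python
import numpy as np

def spectral_clustering(L, k, n_iter=50):
    """Embed with the first k Laplacian eigenvectors, row-normalize,
    then run a plain Lloyd/K-means loop in the eigenspace."""
    _, phi = np.linalg.eigh(L)
    U = phi[:, :k]
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    # simple deterministic init: first k distinct embedded rows
    centers = np.unique(np.round(U, 8), axis=0)[:k].astype(float)
    labels = np.zeros(len(U), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(((U[:, None, :] - centers[None]) ** 2).sum(-1), 1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels
```

On a graph with two disconnected components, the first two eigenvectors span the component indicators, so the clusters recover the components exactly.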
    • Spectral clustering: unimodal. Ng et al. 2001; Eynard, Bronstein², Glashoff 2012 29 / 85
    • Multimodal spectral clustering: unimodal results on Modality 1 and Modality 2. Ng et al. 2001; Eynard, Bronstein², Glashoff 2012 30 / 85
    • Diagonalization of the Laplacian. Eigendecomposition can be posed as the minimization problem min_{ΦᵀΦ=I} off(Φᵀ L Φ), with off-diagonality penalty off(X) = Σ_{i≠j} x_ij². Jacobi iteration: compose Φ = · · · R3 R2 R1 as a sequence of Givens rotations, where each new rotation tries to reduce the off-diagonal terms. There is an analytic expression for the optimal rotation at a given pivot; the rotation is applied in place (no matrix multiplication); the off-diagonal terms are guaranteed to decrease; orthonormality is guaranteed by construction. Jacobi 1846 36 / 85
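A didactic version of the Jacobi iteration (for clarity each rotation is applied here as a full matrix product, while, as noted above, production code rotates rows and columns p and q in place):

```python
import numpy as np

def jacobi_eigh(A, sweeps=20):
    """Classical Jacobi iteration: accumulate Givens rotations, each
    chosen analytically to zero the pivot entry A[p, q]."""
    A = A.astype(float).copy()
    n = A.shape[0]
    Phi = np.eye(n)
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-14:
                    continue
                # analytic optimal angle for this pivot
                theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                R = np.eye(n)
                R[p, p] = R[q, q] = c
                R[p, q] = s
                R[q, p] = -s
                A = R.T @ A @ R
                Phi = Phi @ R
    return np.diag(A), Phi
```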
    • Joint approximate diagonalization. Laplacians of X and Y diagonalized independently: min_{ΦᵀΦ=I, ΨᵀΨ=I} off(Φᵀ L_X Φ) + off(Ψᵀ L_Y Ψ), yielding unrelated bases φ2, . . . , φ5 and ψ2, . . . , ψ5. Cardoso 1995; Eynard, Bronstein², Glashoff 2012 37 / 85
    • Joint approximate diagonalization. Diagonalize the Laplacians of X and Y simultaneously: min_{Φ̂ᵀΦ̂=I} off(Φ̂ᵀ L_X Φ̂) + off(Φ̂ᵀ L_Y Φ̂). In most cases, Φ̂ is only an approximate eigenbasis. Modified Jacobi iteration (JADE): compose Φ̂ = · · · R3 R2 R1 as a sequence of Givens rotations, where each new rotation tries to reduce the off-diagonal terms; the overall complexity is akin to the standard Jacobi iteration. Cardoso 1995; Eynard, Bronstein², Glashoff 2012 41 / 85
    • Multimodal spectral clustering: Modality 1, Modality 2, and multimodal (JADE) results. Ng et al. 2001; Eynard, Bronstein², Glashoff 2012 43 / 85
    • Disambiguating the NUS dataset. Subset of the NUS-WIDE dataset: annotated images belonging to 7 ambiguous classes. Modality 1: 1000-dimensional distributions of frequent tags. Modality 2: 64-dimensional color histogram image descriptors. Laplacians: Gaussian weights with 20 nearest neighbors. Eynard, Bronstein², Glashoff 2012; data: Chua et al. 2009 44 / 85
    • Disambiguating the NUS dataset. [Figure: tag clusters (ambiguity e.g. between underwater tiger and water)] Eynard, Bronstein², Glashoff 2012; data: Chua et al. 2009 45 / 85
    • Disambiguating the NUS dataset. [Figure: color histogram clusters (ambiguity between similarly colored images)] Eynard, Bronstein², Glashoff 2012; data: Chua et al. 2009 46 / 85
    • Disambiguating the NUS dataset. Multimodal clusters. Eynard, Bronstein², Glashoff 2012; data: Chua et al. 2009 47 / 85
    • Drawbacks of JADE. In many applications, we do not need the whole basis, just the first k ≪ n eigenvectors. The explicit assumption of orthonormality of the joint basis restricts the Laplacian discretization to symmetric matrices only. A known bijective correspondence between X and Y is required. 50 / 85
    • Bijective correspondence Chili, food San Francisco, USA Landrover, SUV Cayenne, city Cayenne, Porsche Cayenne, pepper Image space Tag space 1:1 51 / 85
    • Partial correspondence: Chili, food / San Francisco, USA / Landrover, SUV / Cayenne, city / Cayenne, Porsche / Cayenne, pepper / Marijuana, cannabis / Alligator / Crocodile / Bear / Apple MacBook / Orange (image space vs. tag space) 53 / 85
    • Partial correspondence. Two discrete manifolds with different numbers of vertices, X = {x1, . . . , xn} and Y = {y1, . . . , ym}. Laplacians L_X of size n × n and L_Y of size m × m. Sets of corresponding functions F = (f1, . . . , fq) and G = (g1, . . . , gq). We cannot find a common eigenbasis Φ̂ of the Laplacians L_X and L_Y, because they now have different dimensions. 57 / 85
    • Coupled diagonalization. Find two sets of coupled approximate eigenvectors Φ̂, Ψ̂: min_{Φ̂, Ψ̂} off(Φ̂ᵀ L_X Φ̂) + off(Ψ̂ᵀ L_Y Ψ̂) + µ ‖Fᵀ Φ̂ − Gᵀ Ψ̂‖²_F s.t. Φ̂ᵀ Φ̂ = I, Ψ̂ᵀ Ψ̂ = I. Kovnatsky, Bronstein², Glashoff, Kimmel 2013 58 / 85
    • Perturbation of the joint eigenbasis. Theorem (Cardoso 1994): let A = ΦΛΦᵀ be a symmetric matrix with simple δ-separated spectrum (|λ_i − λ_j| ≥ δ) and B = ΦΛΦᵀ + εE. Then the joint approximate eigenvectors of A, B satisfy φ̂_i = φ_i + ε Σ_{j≠i} α_ij φ_j + O(ε²), where |α_ij| = |φ_iᵀ E φ_j / 2(λ_j − λ_i)| ≤ ‖E‖₂ / 2δ. Consequently, span{φ̂1, . . . , φ̂k} ≈ span{φ1, . . . , φk}, i.e. the first k approximate joint eigenvectors can be expressed as linear combinations of k′ ≥ k eigenvectors: Φ̂ ≈ Φ̄R, Ψ̂ ≈ Ψ̄S, where Φ̄ = (φ1, . . . , φ_k′), Λ̄_X = diag(λ_1^X, . . . , λ_k′^X) and Ψ̄ = (ψ1, . . . , ψ_k′), Λ̄_Y = diag(λ_1^Y, . . . , λ_k′^Y). Cardoso 1994; Kovnatsky, Bronstein², Glashoff, Kimmel 2013 61 / 85
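A quick numerical illustration of the theorem (our synthetic example, not from the talk): perturbing a matrix with a δ-separated spectrum barely tilts its leading eigen-subspace.

```python
import numpy as np

# The leading eigen-subspace of a symmetric matrix with delta-separated
# spectrum moves only O(eps) under a symmetric perturbation eps*E.
rng = np.random.default_rng(0)
n, k, eps = 8, 3, 1e-3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))    # random orthonormal Phi
A = Q @ np.diag(np.arange(1.0, n + 1)) @ Q.T        # spectrum 1..n, delta = 1
E = rng.standard_normal((n, n))
E = (E + E.T) / 2                                   # symmetric perturbation
_, U = np.linalg.eigh(A)
_, V = np.linalg.eigh(A + eps * E)
# cosines of the principal angles between the k-dimensional subspaces
s = np.linalg.svd(U[:, :k].T @ V[:, :k], compute_uv=False)
print("max principal angle:", np.max(np.arccos(np.clip(s, -1.0, 1.0))))
```

The printed angle is of order ε‖E‖₂/δ, i.e. a few thousandths of a radian here.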
    • Subspace coupled diagonalization. Find two sets of coupled approximate eigenvectors Φ̂, Ψ̂. Coupling: given a set of corresponding vectors F, G, make their Fourier coefficients coincide, Φ̂ᵀ F ≈ Ψ̂ᵀ G. Based on the perturbation theorem, express the joint approximate eigenbases as linear combinations Φ̂ = Φ̄R, Ψ̂ = Ψ̄S: min_{R,S} off(Rᵀ Φ̄ᵀ L_X Φ̄ R) + off(Sᵀ Ψ̄ᵀ L_Y Ψ̄ S) + µ ‖Fᵀ Φ̄ R − Gᵀ Ψ̄ S‖²_F s.t. Rᵀ Φ̄ᵀ Φ̄ R = I, Sᵀ Ψ̄ᵀ Ψ̄ S = I. Since Φ̄ᵀ L_X Φ̄ = Λ̄_X, Ψ̄ᵀ L_Y Ψ̄ = Λ̄_Y, and Φ̄ᵀ Φ̄ = Ψ̄ᵀ Ψ̄ = I, the problem simplifies to min_{R,S} off(Rᵀ Λ̄_X R) + off(Sᵀ Λ̄_Y S) + µ ‖Fᵀ Φ̄ R − Gᵀ Ψ̄ S‖²_F s.t. Rᵀ R = I, Sᵀ S = I. Kovnatsky, Bronstein², Glashoff, Kimmel 2013; Cardoso 1994 65 / 85
    • Subspace coupled diagonalization. Decoupling: given a set of corresponding vectors F−, G−, make their Fourier coefficients as different as possible: min_{R,S} off(Rᵀ Λ̄_X R) + off(Sᵀ Λ̄_Y S) + µ1 ‖F₊ᵀ Φ̄ R − G₊ᵀ Ψ̄ S‖²_F − µ2 ‖F₋ᵀ Φ̄ R − G₋ᵀ Ψ̄ S‖²_F s.t. Rᵀ R = I, Sᵀ S = I. The Laplacians are not used explicitly: their first k′ eigenfunctions Φ̄, Ψ̄ and eigenvalues Λ̄_X, Λ̄_Y are pre-computed, so any Laplacian can be used. The problem size is 2k′ × k, independent of the number of samples, and no bijective correspondence is required. Kovnatsky, Bronstein², Glashoff, Kimmel 2013; Cardoso 1994 70 / 85
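The resulting objective is tiny: only the k′ × k matrices R and S are unknown. A sketch of the coupling part of the cost (names ours; the decoupling term with F−, G− would enter analogously with weight −µ2):

```python
import numpy as np

def off2(A):
    """Off-diagonality penalty: sum of squared off-diagonal entries."""
    return np.sum(A ** 2) - np.sum(np.diag(A) ** 2)

def coupled_cost(R, S, lam_x, lam_y, FPhi, GPsi, mu1=1.0):
    """Coupling part of the subspace objective:
    off(R^T LamX R) + off(S^T LamY S) + mu1 ||F^T PhiBar R - G^T PsiBar S||_F^2.
    FPhi = F^T PhiBar and GPsi = G^T PsiBar are precomputed q x k' matrices,
    so the unknowns R, S are only k' x k: independent of n and m."""
    return (off2(R.T @ np.diag(lam_x) @ R)
            + off2(S.T @ np.diag(lam_y) @ S)
            + mu1 * np.linalg.norm(FPhi @ R - GPsi @ S, 'fro') ** 2)
```

At R = S = (first k columns of the identity) and identical coupling data the cost vanishes, which matches the intuition that the pre-computed eigenbases are already diagonalizing when no coupling pressure is applied.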
    • Clustering results. Accuracy (%):

      Method              | Circles | Text | Caltech | NUS  | Digits | Reuters
      #points             | 800     | 800  | 105     | 145  | 2000   | 600
      Uncoupled           | 53.0    | 60.4 | 78.1    | 80.7 | 78.9   | 52.3
      Harmonic Mean       | 95.6    | 97.2 | 87.6    | 89.0 | 87.0   | 52.3
      Arithmetic Mean     | 96.5    | 96.9 | 87.6    | 95.2 | 82.8   | 52.2
      Comraf              | 40.8    | 60.8 | –       | 86.9 | 81.6   | 53.2
      MVSC                | 95.6    | 97.2 | 81.0    | 89.0 | 83.1   | 52.3
      MultiNMF            | 41.1    | 50.5 | –       | 77.4 | 87.2   | 53.1
      SC-ML               | 98.2    | 97.6 | 88.6    | 94.5 | 87.8   | 52.8
      JADE                | 100     | 98.4 | 86.7    | 93.1 | 82.5   | 52.3
      CD* (pos, 10%)      | 52.5    | 54.5 | 78.7    | 78.6 | 94.2   | 53.7
      CD* (pos, 20%)      | 61.3    | 60.0 | 80.8    | 82.9 | 94.1   | 54.2
      CD* (pos, 60%)      | 93.7    | 86.5 | 87.0    | 87.2 | 93.9   | 54.7
      CD* (pos, 100%)     | 98.9    | 96.8 | 89.5    | 94.5 | 93.9   | 54.8
      CD* (pos+neg, 10%)  | 67.3    | 63.6 | 86.5    | 92.7 | 94.9   | 59.0
      CD* (pos+neg, 20%)  | 69.6    | 67.8 | 87.9    | 93.3 | 94.8   | 57.6
      CD* (pos+neg, 60%)  | 95.2    | 87.0 | 89.2    | 94.5 | 94.8   | 57.0

      Methods: Eynard 2012; Bekkerman 2007; Cai 2011; Liu 2013; Dong 2013. Data: Cai 2011; Chua 2009; Alpaydin 1998; Liu 2013; Amini 2009 75 / 85
    • Object classification. [Figure: ROC curves (TPR vs. FPR, log scale) comparing Uncoupled 1, Uncoupled 2, JADE, CD (pos), and CD (pos+neg)] 76 / 85
    • Manifold alignment. 831 images (120×100) of a human face and 698 images (64×64) of a statue; the datasets were manually coupled using 25 points sampled with farthest point sampling (FPS); results are compared to manifold alignment (MA). Ham, Lee, Saul 2005 77 / 85
    • Manifold alignment: MA vs. CD. Ham, Lee, Saul 2005; Eynard, Bronstein², Glashoff 2012 78 / 85
    • Summary. A framework for multimodal data analysis, working in the subspace of the eigenvectors of the Laplacians, and with only partial correspondences. We have: some papers (see our Web pages), code and data, and extensions to other applications / fields. 84 / 85
    • Thank you! 85 / 85