Learning Spectral Graph Transformations for Link Prediction

We present a unified framework for learning link prediction and edge weight prediction functions in large networks, based on the transformation of a graph’s algebraic spectrum. Our approach generalizes several graph kernels and dimensionality reduction methods and provides a method to estimate their parameters efficiently. We show how the parameters of these prediction functions can be learned by reducing the problem to a one-dimensional regression problem whose runtime only depends on the method’s reduced rank and that can be inspected visually. We derive variants that apply to undirected, weighted, unweighted, unipartite and bipartite graphs. We evaluate our method experimentally using examples from social networks, collaborative filtering, trust networks, citation networks, authorship graphs and hyperlink networks.

  1. Learning Spectral Graph Transformations for Link Prediction. Jérôme Kunegis & Andreas Lommatzsch, DAI-Labor, Technische Universität Berlin, Germany. 26th Int. Conf. on Machine Learning, Montréal 2009.
  2. Outline
     The Problem
       – Link prediction
       – Known solutions
     Learning
       – Spectral transformations
       – Finding the best spectral transformation
     Variants
       – Weighted and signed graphs
       – Bipartite graphs and the SVD
       – Graph Laplacian and normalization
     Some Applications
  3. The Problem: Link Prediction
     Known links (A); links to be predicted (B).
     Motivation: recommend connections in a social network.
     Predict links in an undirected, unweighted network.
     Using the adjacency matrices A and B, find a function F(A) giving prediction values corresponding to B: F(A) ≈ B.
  4. Path Counting
     Follow paths: the number of paths of length k is given by Aᵏ.
     Nodes connected by many paths, and by short paths, are likely to be linked.
     Weight the powers of A: αA² + βA³ + γA⁴ + … with α > β > γ > … > 0
     Examples:
       – Exponential graph kernel: e^(αA) = Σᵢ (αⁱ/i!) Aⁱ
       – Von Neumann kernel: (I − αA)⁻¹ = Σᵢ αⁱ Aⁱ (with 0 < α < 1)
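These path-counting kernels can be written down directly with NumPy/SciPy. A minimal sketch, assuming a symmetric 0/1 adjacency matrix `A` held as a dense array; the weights and the values of α are illustrative, not taken from the talk:

```python
import numpy as np
from scipy.linalg import expm

def weighted_path_counts(A, weights=(0.5, 0.3, 0.1)):
    """Score node pairs by a decaying sum of powers of A,
    here alpha*A^2 + beta*A^3 + gamma*A^4 with illustrative weights."""
    scores = np.zeros_like(A, dtype=float)
    Ak = A.astype(float)
    for w in weights:
        Ak = Ak @ A            # A^2, then A^3, then A^4
        scores += w * Ak
    return scores

def exponential_kernel(A, alpha=0.1):
    """Exponential graph kernel e^(alpha A)."""
    return expm(alpha * A)

def von_neumann_kernel(A, alpha=0.05):
    """Von Neumann kernel (I - alpha A)^(-1); converges for alpha < 1 / rho(A)."""
    return np.linalg.inv(np.eye(A.shape[0]) - alpha * A)
```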
  5. Laplacian Link Prediction Functions
     Graph Laplacian: L = D − A
       – “Resistance distance” L⁺ (a.k.a. commute-time kernel)
       – Regularized Laplacian: (I + αL)⁻¹
       – Heat diffusion kernel: e^(−αL)
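The Laplacian-based functions have equally direct closed forms. A sketch under the same dense, symmetric adjacency assumption; the α values are illustrative:

```python
import numpy as np
from scipy.linalg import expm

def laplacian(A):
    """Unnormalized graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def commute_time_kernel(A):
    """Moore-Penrose pseudoinverse L+ ("resistance distance")."""
    return np.linalg.pinv(laplacian(A))

def regularized_laplacian_kernel(A, alpha=0.5):
    """(I + alpha L)^(-1); alpha is an illustrative value."""
    return np.linalg.inv(np.eye(A.shape[0]) + alpha * laplacian(A))

def heat_diffusion_kernel(A, alpha=0.5):
    """e^(-alpha L); alpha is an illustrative value."""
    return expm(-alpha * laplacian(A))
```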
  6. Computation of Link Prediction Functions
     Adjacency matrix, eigenvalue decomposition A = UΛUᵀ:
       – Matrix polynomial: Σᵢ αᵢAⁱ = U (Σᵢ αᵢΛⁱ) Uᵀ
       – Matrix exponential: e^(αA) = U e^(αΛ) Uᵀ
       – Von Neumann kernel: (I − αA)⁻¹ = U (I − αΛ)⁻¹ Uᵀ
       – Rank reduction: A⁽ᵏ⁾ = U Λ⁽ᵏ⁾ Uᵀ
     Graph Laplacian, eigenvalue decomposition L = D − A = UΛUᵀ:
       – Resistance distance: L⁺ = U Λ⁺ Uᵀ
       – Regularized Laplacian: (I + αL)⁻¹ = U (I + αΛ)⁻¹ Uᵀ
       – Heat diffusion kernel: e^(−αL) = U e^(−αΛ) Uᵀ
     All of these are spectral transformations.
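All of these closed forms reduce to one generic routine: eigendecompose once, transform the eigenvalues elementwise, and reassemble. A sketch, again assuming a dense symmetric input; the parameters in the example calls are made up:

```python
import numpy as np

def spectral_transform(A, f):
    """Apply F(A) = U diag(f(Lambda)) U^T for a symmetric matrix A,
    where f acts elementwise on the eigenvalues."""
    lam, U = np.linalg.eigh(A)      # A = U diag(lam) U^T
    return (U * f(lam)) @ U.T

# Illustrative uses (alpha and the cut-off x0 are made-up values):
# exponential kernel:  spectral_transform(A, lambda x: np.exp(0.1 * x))
# von Neumann kernel:  spectral_transform(A, lambda x: 1.0 / (1.0 - 0.05 * x))
# rank reduction:      spectral_transform(A, lambda x: np.where(np.abs(x) >= x0, x, 0.0))
```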
  7. Learning Spectral Transformations
     Link prediction functions are spectral transformations of A or L:
       F(A) = U F(Λ) Uᵀ, with F(Λ)ᵢᵢ = f(Λᵢᵢ)
     A spectral transformation F corresponds to a function of reals f:
       – Matrix polynomial:     F(A) = Σᵢ αᵢAⁱ      f(x) = Σᵢ αᵢxⁱ (real polynomial)
       – Matrix exponential:    F(A) = e^(αA)       f(x) = e^(αx) (real exponential)
       – Matrix inverse:        F(A) = (I ± αA)⁻¹   f(x) = 1 / (1 ± αx) (rational function)
       – Pseudoinverse:         F(A) = A⁺           f(x) = 1/x when x > 0, 0 otherwise
       – Rank-k approximation:  F(A) = A⁽ᵏ⁾         f(x) = x when |x| ≥ x₀, 0 otherwise
  8. Finding the Best Spectral Transformation
     Find the best spectral transformation on the test set B:
       min_F ‖F(A) − B‖_F
     Reduce the minimization problem (the norm is preserved by the orthogonal U):
       = min_F ‖U F(Λ) Uᵀ − B‖_F
       = min_F ‖F(Λ) − UᵀBU‖_F
     Reduce to the diagonal, because the off-diagonal entries of F(Λ) are constant zero:
       min_f Σᵢ (f(Λᵢᵢ) − (UᵀBU)ᵢᵢ)²
     The best spectral transformation is thus given by a one-dimensional least-squares problem.
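The learning step is therefore a curve fit over the pairs (Λᵢᵢ, (UᵀBU)ᵢᵢ). A sketch that fits a polynomial f by ordinary least squares, assuming dense symmetric matrices A (known links) and B (links to predict); the degree and the optional rank cut-off k are illustrative choices, not values from the talk:

```python
import numpy as np

def learn_polynomial_transform(A, B, degree=3, k=None):
    """Fit f(x) = sum_j c_j x^j so that f(Lambda_ii) ~ (U^T B U)_ii,
    where A = U Lambda U^T. Returns the coefficients and the prediction F(A)."""
    lam, U = np.linalg.eigh(A)
    if k is not None:                                 # keep only the k dominant eigenpairs
        idx = np.argsort(-np.abs(lam))[:k]
        lam, U = lam[idx], U[:, idx]
    targets = np.einsum('ij,jk,ki->i', U.T, B, U)     # diagonal of U^T B U
    X = np.vander(lam, degree + 1, increasing=True)   # design matrix: 1, x, x^2, ...
    coeffs, *_ = np.linalg.lstsq(X, targets, rcond=None)

    def f(x):
        return np.vander(x, degree + 1, increasing=True) @ coeffs

    prediction = (U * f(lam)) @ U.T                   # F(A) = U diag(f(Lambda)) U^T
    return coeffs, prediction
```

For large graphs one would replace the dense `eigh` with a truncated decomposition such as `scipy.sparse.linalg.eigsh`, which is where the reduced rank k of the later slides comes in; other curve families (exponential, rational) can be fitted to the same pairs, e.g. with `scipy.optimize.curve_fit`.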
  9. Example: DBLP Citation Network (undirected)
     Symmetric adjacency matrices A = UΛUᵀ and B.
     [Plot: (UᵀBU)ᵢᵢ against Λᵢᵢ, with fitted curves αx³ + βx² + γx + δ, βe^(αx), and “x when x < x₀, else 0”.]
  10. Variants: Weighted and Signed Graphs
      Weighted undirected graphs: use A and L = D − A as is.
      Signed graphs: use Dᵢᵢ = Σⱼ |Aᵢⱼ| (signed graph Laplacian).
      Example: Slashdot Zoo (social network with negative edges).
      [Plot: learned spectral transformation for the Slashdot Zoo network.]
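The only change for signed graphs is the degree matrix. A small sketch, assuming a symmetric matrix A with positive and negative edge weights:

```python
import numpy as np

def signed_laplacian(A):
    """Signed graph Laplacian: D_ii = sum_j |A_ij|, L = D - A."""
    D = np.diag(np.abs(A).sum(axis=1))
    return D - A
```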
  11. Bipartite Graphs
      In bipartite graphs, paths connecting the two sides have odd length.
      Compute a sum of odd powers of A; the resulting polynomial is odd: αA³ + βA⁵ + …
      For other link prediction functions, use the odd component:
        – e^(αA) → sinh(αA)
        – (I − αA)⁻¹ → αA (I − α²A²)⁻¹
  12. Singular Value Decomposition
      Odd power of a bipartite graph’s adjacency matrix:
        A²ⁿ⁺¹ = [0 R; Rᵀ 0]²ⁿ⁺¹ = [0 (RRᵀ)ⁿR; Rᵀ(RRᵀ)ⁿ 0]
      Using the singular value decomposition R = UΣVᵀ:
        (RRᵀ)ⁿR = (UΣVᵀ VΣUᵀ)ⁿ UΣVᵀ = (UΣ²Uᵀ)ⁿ UΣVᵀ = UΣ²ⁿ⁺¹Vᵀ
      Odd powers of A are given by odd spectral transformations of R.
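In practice this means taking the SVD of the biadjacency matrix R and transforming its singular values with an odd function. A sketch, with an illustrative choice of sinh(αx) and a made-up α:

```python
import numpy as np

def odd_spectral_transform(R, f_odd):
    """Apply an odd spectral transformation to a bipartite graph's
    biadjacency matrix R via its SVD: U diag(f_odd(sigma)) V^T."""
    U, sigma, Vt = np.linalg.svd(R, full_matrices=False)
    return (U * f_odd(sigma)) @ Vt

# Illustrative use: odd component of the exponential kernel.
# predictions = odd_spectral_transform(R, lambda x: np.sinh(0.01 * x))
```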
  13. SVD Example: Bipartite Rating Graph
      MovieLens rating graph; rating values {−2, −1, 0, +1, +2}.
      [Plot: targets against the singular values, with fitted curves αx⁵ + βx³ + γx, β sinh(αx), αx (for x < x₀), and sgn(x)(α|x|² + β|x| + γ) (for x < x₀).]
  14. Base Matrices
      Base matrices A and L.
      Normalized variants: D^(−1/2)AD^(−1/2) and D^(−1/2)LD^(−1/2) = I − D^(−1/2)AD^(−1/2).
      Example: Advogato trust network, using D^(−1/2)AD^(−1/2).
      [Plot: fitted curves αx² + βx + γ (α, β, γ ≥ 0), βe^(αx), and β / (1 − αx).]
      Note: ignore the eigenvalue 1 (constant eigenvector).
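Computing the normalized variants is a one-liner per matrix. A sketch, assuming a dense nonnegative A; the eps guard against isolated nodes is my addition, not from the slides:

```python
import numpy as np

def normalized_adjacency(A, eps=1e-12):
    """D^(-1/2) A D^(-1/2); eps guards against zero-degree nodes (assumption)."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, eps))
    return A * np.outer(d_inv_sqrt, d_inv_sqrt)

def normalized_laplacian(A):
    """D^(-1/2) L D^(-1/2) = I - D^(-1/2) A D^(-1/2)."""
    return np.eye(A.shape[0]) - normalized_adjacency(A)
```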
  15. Learning Laplacian Kernels
      Epinions (signed user-user network), using L.
      Note: Λᵢᵢ > 0 because the graph is signed and unbalanced.
      [Plot: fitted curves β / (1 + αx) and β e^(−αx).]
  16. Almost Bipartite Networks
      Dating site LíbímSeTi.cz (users rate users).
      Some networks are “almost” bipartite; the bipartition here is men/women.
      [Plot: the spectrum has near central symmetry; fitted curves αx³ + βx² + γx + δ and β sinh(αx).]
  17. Learning the Reduced Rank k
      Some plots suggest a reduced rank k: beyond the significant singular values, (UᵀBU)ᵢᵢ ≈ 0.
      Example: MovieLens/SVD; learned rank k = 14.
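One way to read such a rank off a plot programmatically is to count the eigenpairs whose target value is clearly nonzero. A heuristic sketch; the relative threshold is an illustrative choice, not the method from the talk:

```python
import numpy as np

def estimate_rank(A, B, rel_threshold=0.05):
    """Count the eigenpairs whose target (U^T B U)_ii clearly differs from
    zero, relative to the largest target (illustrative threshold)."""
    lam, U = np.linalg.eigh(A)
    targets = np.einsum('ij,jk,ki->i', U.T, B, U)
    return int((np.abs(targets) > rel_threshold * np.abs(targets).max()).sum())
```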
  18. Conclusion & Ongoing Research
      Conclusions
        – Many link prediction functions are spectral transformations.
        – Spectral transformations can be learned.
      Ongoing Research
        – New link prediction functions: sinh(A), odd von Neumann (pseudo-)kernel
        – Signed graph Laplacian
        – Other matrix decompositions
        – Other norms
        – More datasets
      Thank You
