Random Walk on Graphs
Pavan Kapanipathi
Reading Group (Kno.e.sis)
Referred: Purnamrita Sarkar, Random Walks on Graphs: An Overview
Agenda
• Introduction
– Motivation

• Background
– Graphs
– Matrices

• Random Walk
– PageRank
– Personalized PageRank
– Topic Sensitive PageRank

• Applications
– Specifically in Recommender Systems
Random Walk

A drunk man will find his way home, but a drunk
bird may get lost forever.
Motivation: Link prediction in social
networks

Motivation: Basis for recommendation

Since I had very few slides and more time – Graphs
• Undirected Graphs
Since I had very few slides and more time in hand – Graphs
• Directed Graphs
Since I had very few slides and more time in hand – Matrix
[Figure: a matrix with rows and columns indexed by nodes i, j, and k; the highlighted entry sits at row i, column j]
Adjacency and Transition Matrix

[Figure: a small example graph shown with its adjacency matrix A (0/1 entries) and its transition matrix P, obtained by dividing each row of A by that node's out-degree (entries 1, 1, 1/2, 1/2)]
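To make the A-to-P relationship concrete, here is a minimal sketch in Python, assuming a hypothetical three-node directed graph whose transition matrix happens to have the entries 1, 1, 1/2, 1/2 shown on the slide (the slide's exact edge set is not recoverable):

```python
import numpy as np

# Hypothetical 3-node directed graph (nodes i=0, j=1, k=2); illustrative only.
edges = [(0, 1), (0, 2), (1, 0), (2, 0)]

n = 3
A = np.zeros((n, n))                 # adjacency matrix: A[u, v] = 1 if there is an edge u -> v
for u, v in edges:
    A[u, v] = 1

out_degree = A.sum(axis=1)           # number of outgoing edges per node
P = A / out_degree[:, None]          # transition matrix: each row of A divided by its out-degree

print(A)
print(P)                             # row 0 has entries 1/2, 1/2; rows 1 and 2 each have a single 1
```

By construction every row of P sums to 1, which is exactly the "stochastic" property asked about two slides later.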
Markov Property (Basic)
• Given the present state, the future and past
states are independent
Stochastic
• Wikipedia: In probability theory, a purely stochastic system is one whose state is nondeterministic, so that the subsequent state of the system is determined probabilistically.
• Matrix – stochastic by row or by column?

[Figure: the example transition matrix with entries 1, 1, 1/2, 1/2]
Random Walk on Graphs

[Figure: successive steps of a random walk on the example graph, shown over several slides]

The random sequence of points selected this way is a random walk on the graph.
Again: Transition Matrix
[Figure: the example graph with nodes i, j, k and its transition matrix P with entries 1, 1, 1/2, 1/2 – what is the probability of each transition?]
Probability Distributions
• x_t(i) = probability that the surfer is at node i at time t
• x_{t+1}(i) = Σ_j (probability of being at node j at time t) · Pr(j → i) = Σ_j x_t(j) P(j, i)
• x_{t+1} = x_t P = x_{t−1} P·P = x_{t−2} P·P·P = … = x_0 P^{t+1}   (matrix multiplication!)
• What happens when the surfer keeps walking for a long time?
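A short sketch of iterating x_{t+1} = x_t P, reusing the hypothetical transition matrix from the earlier sketch; the starting node is an arbitrary choice:

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],   # hypothetical transition matrix from the earlier sketch
              [1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])

x = np.array([1.0, 0.0, 0.0])    # x_0: the surfer starts at node 0 with probability 1
for t in range(6):
    x = x @ P                    # one step of the walk: x_{t+1} = x_t P
    print(t + 1, x)
# This particular toy graph is periodic, so x keeps oscillating instead of settling down;
# the "well-behaved" conditions on the later slides make convergence precise.
```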
Property of Adjacency Matrix

[Figure: the example graph with its adjacency matrix A (0/1 entries)]
What is a stationary distribution?
Intuitively and Mathematically
• The stationary distribution at a node is related to the amount of time a random walker spends visiting that node.
• Remember that we can write the probability distribution at a node as
– x_{t+1} = x_t P
• For the stationary distribution v_0 we have
– v_0 = v_0 P
• Whoa! That's just the left eigenvector of the transition matrix!
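As a sketch of the eigenvector view, the stationary distribution can be read off numerically as the left eigenvector of P for eigenvalue 1 (the matrix below is a hypothetical irreducible, aperiodic example, not the slide's own):

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.5],               # hypothetical irreducible, aperiodic chain
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])

# Left eigenvectors of P are right eigenvectors of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))       # pick the eigenvalue (closest to) 1
v = np.real(eigvecs[:, idx])
v = v / v.sum()                              # normalize into a probability distribution

print(v)                                     # stationary distribution
print(v @ P)                                 # equals v (up to numerical error): v = v P
```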
Eigenvalue and Eigenvector?
Interesting questions
• Does a stationary distribution always exist? Is it unique?
– Yes, if the graph is “well-behaved”.
• What is "well-behaved"?
– We shall talk about this soon.
• How fast will the random surfer approach this stationary distribution?
– Mixing time!
Well behaved graphs
• Irreducible: There is a path from every node to every other
node.

What about a connected undirected graph?

[Figure: one example graph labeled "Irreducible" and one labeled "Not irreducible"]
Well behaved graphs
• Aperiodic: The GCD of all cycle lengths is 1. The GCD is also
called period.

[Figure: a graph whose periodicity is 3, and a graph that is aperiodic]
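Both conditions can be checked programmatically; a sketch using networkx, with hypothetical stand-ins for the slide's two figures:

```python
import networkx as nx

# A directed 3-cycle: irreducible (strongly connected) but its period is 3.
cycle3 = nx.DiGraph([(0, 1), (1, 2), (2, 0)])

# Adding one back-edge creates cycles of lengths 2 and 3, so the gcd of cycle lengths is 1.
with_chord = nx.DiGraph([(0, 1), (1, 2), (2, 0), (1, 0)])

print(nx.is_strongly_connected(cycle3), nx.is_aperiodic(cycle3))          # True False
print(nx.is_strongly_connected(with_chord), nx.is_aperiodic(with_chord))  # True True
```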
Implications of the Perron Frobenius Theorem
• If a Markov chain is irreducible and aperiodic then the largest eigenvalue of the transition matrix will be equal to 1 and all the other eigenvalues will be strictly less than 1.
– Let the eigenvalues of P be {σ_i | i = 0, …, n−1} in non-increasing order:
– σ_0 = 1 > σ_1 ≥ σ_2 ≥ … ≥ σ_{n−1}
• These results imply that for a well-behaved graph there exists a unique stationary distribution.
• More details when we discuss PageRank.
Some fun stuff about undirected graphs
• A connected undirected graph is irreducible.
• A connected non-bipartite undirected graph has a stationary distribution proportional to the degree distribution! (See the check below.)
• Makes sense: the larger the degree of a node, the more likely a random walk is to come back to it.
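A quick numerical check of the degree-proportionality claim, i.e. v(i) = d(i) / 2m, on a hypothetical connected non-bipartite graph (a triangle with one pendant edge):

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # triangle 0-1-2 plus the edge 2-3 (non-bipartite)
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1                  # undirected: symmetric adjacency matrix

deg = A.sum(axis=1)
P = A / deg[:, None]                       # random-walk transition matrix

v = deg / deg.sum()                        # claimed stationary distribution: degree / (2 * #edges)
print(v)
print(v @ P)                               # same vector: v P = v
```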
Proximity measures from random walks
[Figure: a graph with two marked nodes a and b]

• How long does it take to hit node b in a random walk starting at node a? --- Hitting time.
• How long does it take to hit node b and come back to node a? --- Commute time.
Hitting and Commute times
[Figure: a graph with two marked nodes a and b]

• Hitting time from node i to node j
– Expected number of hops to hit node j starting at node i.
– Is not symmetric: in general h(a, b) ≠ h(b, a).
– h(i, j) = 1 + Σ_{k ∈ nbs(i)} P(i, k) h(k, j)   (see the sketch below)
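For a fixed target j the recurrence is a linear system: h(i, j) − Σ_k P(i, k) h(k, j) = 1 for all i ≠ j, with h(j, j) = 0. A sketch that solves it directly (the chain is a hypothetical example) and also yields the commute time of the next slide:

```python
import numpy as np

def hitting_times_to(P, j):
    """Expected number of hops h(i, j) to reach node j from every node i."""
    n = P.shape[0]
    others = [i for i in range(n) if i != j]
    # For i != j: h(i) = 1 + sum_k P(i, k) h(k), with h(j) = 0,
    # i.e. (I - Q) h = 1 where Q is P restricted to the non-target nodes.
    Q = P[np.ix_(others, others)]
    h = np.zeros(n)
    h[others] = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    return h

P = np.array([[0.0, 0.5, 0.5],   # hypothetical irreducible chain
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])

h_to_2 = hitting_times_to(P, 2)
h_to_0 = hitting_times_to(P, 0)
print(h_to_2)                            # h(i, 2) for i = 0, 1, 2
print(h_to_0[2] + h_to_2[0])             # commute time c(0, 2) = h(0, 2) + h(2, 0)
```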
Hitting and Commute times
[Figure: a graph with two marked nodes a and b]

• Commute time between node i and j
– Is the expected time to hit node j and come back to i
– c(i, j) = h(i, j) + h(j, i)
– Is symmetric: c(a, b) = c(b, a)
Random Walk (versions)
• PageRank
– Personalized PageRank
– Topic Sensitive PageRank

• Recommender Systems (My interests)
Recommender Networks
• For a customer node i define similarity as
– H(i, j)
– C(i, j)
– Or the cosine similarity L_ij / √(L_ii · L_jj)   (see the sketch below)
• Now the question is how to compute these quantities quickly for very large graphs.
– Fast iterative techniques (Brand 2005)
– Fast Random Walk with Restart (Tong, Faloutsos 2006)
– Finding nearest neighbors in graphs (Sarkar, Moore 2007)
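A sketch of the cosine-similarity option, under the assumption that L here denotes the Moore-Penrose pseudoinverse of the graph Laplacian (the usual choice in the commute-time literature; the slide does not spell this out):

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # hypothetical small undirected customer/item graph
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

Lap = np.diag(A.sum(axis=1)) - A           # graph Laplacian D - A
L = np.linalg.pinv(Lap)                    # assumption: "L" on the slide = pseudoinverse of the Laplacian

def cosine_similarity(i, j):
    return L[i, j] / np.sqrt(L[i, i] * L[j, j])   # L_ij / sqrt(L_ii * L_jj)

print(cosine_similarity(0, 1), cosine_similarity(0, 3))
```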
PageRank (Initial)
• Intuition
– PageRank of "A" is higher if the pages that link to "A" have higher PageRank
• Models user behavior where a surfer clicks on links at random with no regard for content
– One page's PageRank is not completely passed on to a page it links to, but is divided by the number of links on the page.
PageRank
• Intuitively,

v(i) = Σ_{j → i} v(j) / deg_out(j)

• v works out to be the stationary distribution of the Markov chain corresponding to the web.
PageRank & Perron-Frobenius
• Perron Frobenius only holds if the graph is irreducible and
aperiodic.
• But how can we guarantee that for the web graph?
– Do it with a small restart probability c.
• At any time-step the random surfer
– jumps (teleports) to any other node with probability c
– jumps to its direct neighbors with total probability 1 − c.

P̃ = (1 − c) P + c U,   where U_ij = 1/n for all i, j
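A minimal sketch of the modified matrix P̃ = (1 − c) P + c U with uniform teleportation (P is a hypothetical example; for a real web graph U would never be materialized explicitly):

```python
import numpy as np

c = 0.15                                   # small restart (teleportation) probability
P = np.array([[0.0, 0.5, 0.5],             # hypothetical row-stochastic transition matrix
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])
n = P.shape[0]

U = np.full((n, n), 1.0 / n)               # U_ij = 1/n: teleport uniformly to any node
P_tilde = (1 - c) * P + c * U              # irreducible and aperiodic by construction

print(P_tilde.sum(axis=1))                 # each row still sums to 1
```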
PageRank
• We are looking for the vector v s.t.
v  (1  c ) vP  cr

• r is a distribution over web-pages.

• If r is the...
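The fixed point can be found by simple iteration, as sketched below; with a uniform r this gives PageRank, and with a concentrated r it becomes the personalized variant of the next slide (P and r are hypothetical examples):

```python
import numpy as np

def pagerank(P, r, c=0.15, iters=100):
    """Iterate v <- (1 - c) v P + c r."""
    v = np.array(r, dtype=float)
    for _ in range(iters):
        v = (1 - c) * (v @ P) + c * r
    return v

P = np.array([[0.0, 0.5, 0.5],             # hypothetical transition matrix
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])

r_uniform = np.ones(3) / 3                 # uniform teleportation -> plain PageRank
r_personal = np.array([1.0, 0.0, 0.0])     # all restarts go to node 0 -> personalized PageRank

print(pagerank(P, r_uniform))
print(pagerank(P, r_personal))             # scores biased towards node 0 and its neighborhood
```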
Personalized PageRank [1, 2, 3]
• The only difference is that we use a non-uniform teleportation distribution, i.e. at any time step teleport to a set of webpages.
• In other words we are looking for the vector v s.t. v = (1 − c) v P + c r
• r is a non-uniform preference vector specific to a user.
• v gives "personalized views" of the web.

1. Scaling Personalized Web Search, Jeh and Widom, 2003
2. Topic-Sensitive PageRank, Haveliwala, 2001
3. Towards Scaling Fully Personalized PageRank, D. Fogaras and B. Rácz, 2004
Topic-Sensitive PageRank (Haveliwala '01)
• Divide the webpages into 16 broad categories
• For each category compute the biased personalized PageRank vector by uniformly teleporting to websites under that category.
• At query time the probability of the query being from any of the above classes is computed, and the final PageRank vector is computed as a linear combination of the biased PageRank vectors computed offline.
Random Walk for Recommendations
• Collaborative Filtering by Shang et al.
• Graph
– Vertices: Users (U), Items (I), Item Information (T) and User Profiles (P)
– Edges (with weights) whenever:
• u has a rating for i
• i has a tag t
• u belongs to profile category p
• u is connected to another user in the social network
– Edge weight assignments
Random Walk for Recommendations
[Figure: the random-walk equation annotated – "Connectivity/Transition" (the transition matrix), "Generally 0.85" (the damping factor), and "Preference Vector"]
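To tie the pieces together, a hedged sketch of this kind of recommender as a random walk with restart over a user-item graph. The graph, edge weights and ranking rule below are illustrative assumptions, not Shang et al.'s exact construction; the restart probability is chosen so the damping factor is the usual 0.85:

```python
import numpy as np

# Hypothetical graph: users u0, u1 and items i0, i1, i2 (nodes 0-4),
# connected by undirected "has rated" edges weighted by placeholder ratings.
nodes = ["u0", "u1", "i0", "i1", "i2"]
edges = [(0, 2, 5.0), (0, 3, 3.0), (1, 3, 4.0), (1, 4, 5.0)]

n = len(nodes)
W = np.zeros((n, n))
for u, v, w in edges:
    W[u, v] = W[v, u] = w
P = W / W.sum(axis=1, keepdims=True)       # row-normalize edge weights into transition probabilities

def random_walk_with_restart(P, r, c=0.15, iters=200):
    v = np.array(r, dtype=float)
    for _ in range(iters):
        v = (1 - c) * (v @ P) + c * r      # damping factor 1 - c = 0.85
    return v

r = np.zeros(n)
r[0] = 1.0                                 # preference vector: all restarts return to user u0
scores = random_walk_with_restart(P, r)

unseen = [i for i in (2, 3, 4) if W[0, i] == 0]     # items u0 has not rated yet
best = max(unseen, key=lambda i: scores[i])
print(scores, "-> recommend", nodes[best])
```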
Most of it from
• Purnamrita Sarkar, Random Walks on Graphs: An Overview
• Random Walks on Graphs: A Survey, László Lovász
• OBVIOUSLY: Wikipedia :D
• Random Walk on Graphs: Ankit Agarwal
Thanks
Slide notes
  • Probability of a random surfer going from node k to node j.
  • Continuing with the transition matrix – how many know how matrix multiplication works?
  • Probability of reaching node "j" from node "i" in a path of length 2.
  • Almost all vectors change direction when multiplied by a matrix A. Certain exceptional vectors x remain in the same direction when multiplied by A; these are called eigenvectors.
  • The 0 entries in the P matrix will recur – a connected undirected graph is irreducible.
  • Why is periodicity important? Because if the chain is periodic, returns to a node occur only at multiples of the period, so the walk does not settle down. I will get into the details if time permits and it is necessary. Periodicity: a state i has period k if any return to state i must occur in multiples of k time steps. Formally, the period of a state is defined as k = gcd{n > 0 : Pr(X_n = i | X_0 = i) > 0} (where "gcd" is the greatest common divisor). Note that even though a state has period k, it may not be possible to reach the state in k steps. For example, suppose it is possible to return to the state in {6, 8, 10, 12, ...} time steps; k would be 2, even though 2 does not appear in this list. If k = 1, then the state is said to be aperiodic: returns to state i can occur at irregular times; in other words, a state i is aperiodic if there exists n such that for all n' ≥ n, Pr(X_{n'} = i | X_0 = i) > 0. Otherwise (k > 1), the state is said to be periodic with period k. A Markov chain is aperiodic if every state is aperiodic. An irreducible Markov chain only needs one aperiodic state to imply all states are aperiodic. Every state of a bipartite graph has an even period.
  • It is more intuitive
  • Very simple intuition
  • C is the damping factor
  • vP should not change the value
  • Manipulating these
  • Random walk on Graphs
