Introduction to Complex Networks
V.A. Traag
KITLV, Leiden, the Netherlands
e-Humanities, KNAW, Amsterdam, the Netherlands
March 30, 2014
e-Humanities, Royal Netherlands Academy of Arts and Sciences
Overview
1 What are networks?
2 Classics: scale free & small worlds.
3 Percolation: giant components, failure & attack and epidemics.
Probability generating functions.
Examples
• Neural networks
• Power grids
• Gas networks
• Internet router network
• World Wide Web
• Road networks
• Airline networks
• Call networks
• Social networks
• Social media networks
Basics
Network
• Graph or network G = (V, E).
• Nodes V = {1, . . . , n} (vertices):
power station, webpage, intersection, person.
• Edges E ⊆ V × V (links, ties):
power cables, hyperlinks, roads, friendships.
Can be directed, and possibly weighted.
Essentials
• Degree ki is the number of links at node i.
• If |E| = m is the number of edges, then Σi ki = 2m.
• Average degree ⟨k⟩ = 2m/n.
• Density p = m / (n choose 2) = ⟨k⟩/(n − 1) ≈ ⟨k⟩/n.
• Most networks are sparse: ⟨k⟩, p low.
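These basic quantities can be computed directly from an edge list. A minimal sketch (the five-node edge list is made up for illustration):

```python
# Degree, average degree <k> = 2m/n, and density p = m / C(n, 2)
# for a small, made-up undirected example graph.
from collections import Counter

edges = [(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)]
n = 5
m = len(edges)

# Degree k_i: number of links at node i.
degree = Counter()
for i, j in edges:
    degree[i] += 1
    degree[j] += 1

# Handshake lemma: the degrees sum to 2m.
assert sum(degree.values()) == 2 * m

avg_degree = 2 * m / n               # <k> = 2m/n
density = m / (n * (n - 1) / 2)      # p = m / C(n, 2)

print(avg_degree, density)  # 2.0 0.5
```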
Picture worth . . . words
Visualisations are essentially wrong, but sometimes insightful.
Need to assess statistics to understand networks.
Analysed properties
Analysis strategy
• Focus on some key (statistical) ingredients.
• Only overall general properties, no particulars.
• Compare to random graph: what can we expect?
• Modelling ⇒ replicate key properties.
Some key properties
• Degree distribution
• Degree correlations
• Path lengths
• Clustering
• Modularity
• Dynamics: inter-event times
Small world
Milgram’s experiment (1960s)
• Ask people to reach a specific person:
John Doe, Journalist, Kansas
• Send the letter to an acquaintance, who forwards it, and so on.
• Result: about 5 intermediaries to reach the destination.
• Six degrees of separation.
Key question: is this different from a random graph?
Erdős-Rényi (ER) graphs
Random graph
• Create an empty graph G with n nodes.
• Every edge appears with probability p.
• On average p (n choose 2) = m edges.
• Average degree ⟨k⟩ = pn.
• The random graph is essentially a (very simple) model.
• Was (and still is) used frequently.
Biology, epidemiology: well-mixed population.
• Many interesting questions still.
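Sampling G(n, p) is a few lines with the standard library. A minimal sketch (n and p are illustrative):

```python
# Sample an Erdos-Renyi G(n, p) graph: each of the C(n, 2) possible
# edges appears independently with probability p.
import random

random.seed(1)
n, p = 1000, 0.01        # expected average degree <k> = p*n = 10

edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < p]

avg_degree = 2 * len(edges) / n
print(avg_degree)        # concentrates around p*(n - 1), i.e. about 10
```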
Small world?
Path length
• Every node has ki ≈ ⟨k⟩, so in ℓ steps we reach about ⟨k⟩^ℓ nodes.
• When ⟨k⟩^ℓ = n we have reached the whole network.
• Hence ℓ ≈ log n / log⟨k⟩: grows slowly!
Random edges create short paths.
Clustering
• Clustering Ci = ei / (ki choose 2).
• In an ER graph Ci = p³n(n − 1) / (p²n(n − 1)) = p.
• Networks are sparse: low p, so low Ci.
Real world: both short paths & clustering. How to get that?
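The claim Ci ≈ p for ER graphs can be checked empirically. A minimal sketch (n and p are illustrative):

```python
# Local clustering C_i = e_i / C(k_i, 2) in an ER sample; in G(n, p)
# the average should come out close to p.
import random

random.seed(2)
n, p = 300, 0.1
adj = {i: set() for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p:
            adj[i].add(j)
            adj[j].add(i)

def clustering(i):
    nbrs = list(adj[i])
    k = len(nbrs)
    if k < 2:
        return 0.0
    # e_i: number of links among the neighbours of i
    e = sum(1 for a in range(k) for b in range(a + 1, k)
            if nbrs[b] in adj[nbrs[a]])
    return e / (k * (k - 1) / 2)

avg_C = sum(clustering(i) for i in range(n)) / n
print(avg_C)   # close to p = 0.1
```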
Watts & Strogatz
Small world model
• Create lattice (connect to nearest neighbours).
• Rewire edge (or add) with probability p.
Few shortcuts enough to create short paths
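The construction above can be sketched in a few lines. A minimal sketch (ring size, neighbourhood range c, and rewiring probability are illustrative):

```python
# Watts-Strogatz sketch: a ring lattice where each node connects to its
# c nearest neighbours on each side, then each edge is rewired with
# probability p_rewire to a random endpoint.
import random

random.seed(3)
n, c, p_rewire = 100, 2, 0.1

edges = set()
for i in range(n):
    for d in range(1, c + 1):
        edges.add((i, (i + d) % n))      # ring lattice: n*c edges

rewired = set()
for (i, j) in edges:
    if random.random() < p_rewire:
        # rewire: keep endpoint i, pick a new random endpoint (no self-loop)
        j = random.choice([v for v in range(n) if v != i])
    rewired.add((min(i, j), max(i, j)))  # store undirected edges normalized

print(len(rewired))   # at most n*c (duplicates created by rewiring collapse)
```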
Degree distribution
[Figure: degree distribution on linear axes; ER degrees concentrate around ki ≈ ⟨k⟩.]
[Figure: the same distribution on log-log axes; the power-law tail contains the hubs.]
• In real networks, power law pk ∼ k^−α, usually 2 < α < 3.
• In ER graphs, Poisson pk ∼ ⟨k⟩^k / k!.
Barabási & Albert
How to get a power-law degree distribution?
Preferential attachment, cumulative advantage
Start with a graph of q nodes.
1 Add a node.
2 Add q links to previous nodes with probability pi ∼ ki.
3 Repeat (1)-(2).
Results
• Analysis by master rate equation: p(k) = (k − 1)/2 · p(k − 1) − k/2 · p(k).
• Leads to p(k) = 2q(q + 1) / (k(k + 1)(k + 2)) ∼ k^−3.
• Preferential attachment ⇒ scale-free network.
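The growth rule can be sketched directly. A minimal sketch (q = 2, the triangle seed, and the network size are illustrative); picking a uniformly random entry of a list in which node i appears ki times implements pi ∼ ki without building the distribution explicitly:

```python
# Preferential attachment sketch: each new node attaches q links to
# existing nodes with probability proportional to their degree.
import random
from collections import Counter

random.seed(4)
q = 2
# seed graph: triangle on nodes 0, 1, 2; each node listed once per incident edge
degree_list = [0, 1, 1, 2, 2, 0]

for new in range(3, 500):
    chosen = set()
    while len(chosen) < q:
        chosen.add(random.choice(degree_list))   # degree-proportional pick
    for t in chosen:
        degree_list += [new, t]                  # record both edge endpoints

deg = Counter(degree_list)
hub_degree = deg.most_common(1)[0][1]
print(hub_degree)    # hubs emerge: far larger than q
```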
Scale free
Scale free: so what?
Why does it matter?
• Scale-free networks are robust against random node failure.
• Vulnerable to targeted attacks (take out the hubs).
• No threshold for epidemic spreading.
Approach: percolation & generating functions.
Generating functions
Definition (Generating function)
Let Pr(S = k) = pk. Then
g(x) = E(x^S) = Σk pk x^k
is the probability generating function (pgf).
Properties
• Normalised: g(1) = E(1^S) = Σk pk = 1.
• Mean: ⟨k⟩ = Σk k pk = Σk k pk 1^{k−1} = g′(1).
• Moments: ⟨k^m⟩ = [(x ∂/∂x)^m g]|x=1.
• Sum: for S = Σi Si over m independent copies, the pgf is f(x) = E(x^{Σ Si}) = g(x)^m.
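The pgf properties can be sanity-checked numerically. A minimal sketch for a Poisson distribution (lam plays the role of ⟨k⟩; the truncation K is an assumption):

```python
# Numeric check of the pgf properties for p_k = e^{-lam} lam^k / k!:
# normalisation g(1) = 1 and mean g'(1) = lam.
import math

lam, K = 3.0, 60
p = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(K)]

def g(x):
    return sum(pk * x**k for k, pk in enumerate(p))

norm = g(1.0)                                     # should be 1
mean = (g(1.0 + 1e-6) - g(1.0 - 1e-6)) / 2e-6     # central difference for g'(1)
print(norm, mean)
```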
Degree generating function
Example: ER degree distribution
• Let pk be the probability that a node has degree k.
• Take pk = (n choose k) p^k (1 − p)^{n−k} (Erdős-Rényi).
• Then the pgf is
g(x) = Σk (n choose k) p^k (1 − p)^{n−k} x^k
= Σk (n choose k) (xp)^k (1 − p)^{n−k}
= (px + (1 − p))^n (binomial theorem)
= (1 + p(x − 1))^n (remember ⟨k⟩ = pn)
= (1 + ⟨k⟩(x − 1)/n)^n → e^{⟨k⟩(x−1)} (n → ∞, definition of exp).
• Normalised: g(1) = e^{⟨k⟩(1−1)} = 1.
• Mean: g′(x) = ⟨k⟩ e^{⟨k⟩(x−1)}, so g′(1) = ⟨k⟩.
• Number of neighbours of m nodes: g(x)^m = e^{m⟨k⟩(x−1)}, with mean (g(x)^m)′|x=1 = m⟨k⟩.
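The limit step can be checked numerically: the binomial pgf (px + (1 − p))^n approaches e^{⟨k⟩(x−1)} as n grows with ⟨k⟩ = pn held fixed (the values of ⟨k⟩ and x below are illustrative):

```python
# Compare the exact binomial pgf with its Poisson limit at one point x,
# for increasing n at fixed <k> = p*n.
import math

k_avg, x = 4.0, 0.7
poisson_pgf = math.exp(k_avg * (x - 1))

errs = []
for n in (10, 100, 10000):
    p = k_avg / n
    binom_pgf = (p * x + (1 - p)) ** n
    errs.append(abs(binom_pgf - poisson_pgf))

print(errs)   # the error shrinks as n grows
```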
Excess degree distribution
Earlier: pick a random node. Now: follow a random edge to a node.
• Probability that the node has k links: k pk / Σk k pk = k pk / ⟨k⟩.
• Average neighbour degree: Σk k · (k pk/⟨k⟩) = ⟨k²⟩/⟨k⟩, and
⟨k²⟩/⟨k⟩ − ⟨k⟩ = (⟨k²⟩ − ⟨k⟩²)/⟨k⟩ > 0 (friendship paradox).
• Probability that the node has k other links (excess degree): qk = (k + 1) pk+1 / ⟨k⟩.
• pgf: g1(x) = Σk qk x^k = (1/⟨k⟩) Σk (k + 1) pk+1 x^k = (1/⟨k⟩) Σk k pk x^{k−1} = g′(x)/⟨k⟩.
• Second neighbours of a node of degree m: pgf g1(x)^m = Σk p2(k|m) x^k.
• Distribution of second neighbours: Σm pm g1(x)^m = g(g1(x)); of third neighbours: g(g1(g1(x))); of d-th neighbours: g(g1(· · · g1(x) · · · )).
• Average number of second neighbours: ⟨k²⟩ − ⟨k⟩.
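The friendship paradox is easy to observe empirically: sampling a node by following a random edge weights it by its degree, so the mean is ⟨k²⟩/⟨k⟩ rather than ⟨k⟩. A minimal sketch on an ER sample (n and p are illustrative):

```python
# Empirical friendship paradox on an ER sample: the degree-weighted
# mean <k^2>/<k> exceeds the plain mean <k>.
import random
from collections import Counter

random.seed(5)
n, p = 1000, 0.004
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < p]

deg = Counter()
for i, j in edges:
    deg[i] += 1
    deg[j] += 1
k = [deg[i] for i in range(n)]

avg_k = sum(k) / n
# following a random edge end samples nodes proportionally to degree
avg_neighbour_k = sum(ki * ki for ki in k) / sum(k)   # <k^2>/<k>

print(avg_k, avg_neighbour_k)   # your neighbours have more friends than you
```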
Giant component
Giant component (GC)
• There is always only one GC (and lots of small components).
• Let u be the probability that a link does not connect a node to the GC.
• Probability that a node of degree k is not in the GC: u^k.
• Probability that a node is not in the GC: Σk pk u^k = g(u).
• Size of the giant component: S = 1 − g(u).
But what is u?
Self-consistency
• The probability that a link does not connect to the GC is u.
• It connects to a node with k other neighbours: the excess degree.
• Averaging: Σk qk u^k = g1(u) = u.
Giant component
How to solve g1(u) = u?
[Figure: graphical solution; u is the fixed point where the curve g1(u) crosses the diagonal u = g1(u).]
If the derivative g1′(1) > 1, a giant component appears.
[Figure: near u = 1 the slope g1′(1) determines whether a nontrivial fixed point exists.]
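The self-consistency equation can be solved by fixed-point iteration. A minimal sketch for an ER network, where g(x) = g1(x) = e^{⟨k⟩(x−1)}:

```python
# Solve u = g1(u) by fixed-point iteration for an ER network and
# return the giant-component size S = 1 - g(u).
import math

def gc_size(k_avg, iters=1000):
    u = 0.5                                  # start away from the trivial u = 1
    for _ in range(iters):
        u = math.exp(k_avg * (u - 1))        # u <- g1(u)
    return 1 - math.exp(k_avg * (u - 1))     # S = 1 - g(u)

print(gc_size(0.5))   # below the threshold <k> = 1: only u = 1, so S = 0
print(gc_size(2.0))   # above the threshold: S is about 0.8
```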
Giant component
• The GC appears when
1 < g1′(1) = Σk k qk = (1/⟨k⟩) Σk k(k + 1) pk+1 = (1/⟨k⟩) Σk k(k − 1) pk = (1/⟨k⟩) Σk (k² pk − k pk) = (⟨k²⟩ − ⟨k⟩)/⟨k⟩.
• For ER graphs ⟨k²⟩ − ⟨k⟩ = ⟨k⟩², so for ⟨k⟩ > 1 the GC appears.
[Figure: GC size S versus ⟨k⟩; S becomes nonzero at ⟨k⟩ = 1.]
• For scale-free graphs ⟨k²⟩ → ∞, so there is always a GC (if 2 < α < 3).
Node failure
How fast is the giant component destroyed if nodes are removed?
Same approach
• Probability φ that a node is not removed from the network.
• Again u: the probability that a link does not connect to the GC.
Self-consistency
• I: Neighbour is removed (1 − φ).
• II: Neighbour is not removed (φ), but is not in the GC (u^k).
• So the probability is 1 − φ + φu^k.
• On average: Σk qk (1 − φ + φu^k) = 1 − φ + φ g1(u) = u.
• Solving for u gives the solution.
Node failure
• Again, solving u = 1 − φ + φg1(u) is not easy.
• But if ∂/∂u [1 − φ + φg1(u)] = φg1′(u) > 1 at u = 1, a GC exists.
• Equivalently, a GC exists if φ > 1/g1′(1) = φc.
• For ER φc = 1/⟨k⟩; for scale free φc = 0.
[Figure: giant component size S as a function of occupation probability φ, for ER and scale-free networks.]
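The fixed point u = 1 − φ + φg1(u) is easy to find numerically by simple iteration. Below is a minimal sketch (not from the slides) for an ER network, where g0(u) = g1(u) = e^{⟨k⟩(u−1)}; the function name is illustrative:

```python
import math

def giant_component_size(phi, mean_degree, tol=1e-12, max_iter=10000):
    """Size S of the giant component after random node removal,
    for an ER network where g0(u) = g1(u) = exp(<k>(u - 1))."""
    g1 = lambda u: math.exp(mean_degree * (u - 1.0))
    u = 0.0  # iterate u = 1 - phi + phi*g1(u) from below
    for _ in range(max_iter):
        u_next = 1.0 - phi + phi * g1(u)
        if abs(u_next - u) < tol:
            break
        u = u_next
    return phi * (1.0 - g1(u))  # S = phi * (1 - g0(u))

# Below phi_c = 1/<k> = 0.25 the GC vanishes; above it, S grows.
print(giant_component_size(0.1, 4.0))  # ~0
print(giant_component_size(0.8, 4.0))  # ~0.76
```

Starting the iteration at u = 0 makes it converge to the smallest fixed point, avoiding the trivial solution u = 1.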
Node attack
What if we attack specific nodes?
Same approach
• Probability φk that a node of degree k does not “fail”.
• On average φ = Σ_k φk pk.
• Again u is the probability that a link does not connect to the GC.
Self consistency
• I: Neighbour is removed (1 − φk).
• II: Neighbour is not removed (φk), but not in the GC (u^{k−1}).
• So on average u = Σ_k q_{k−1}(1 − φk + φk u^{k−1}).
• Define f(u) = Σ_k φk q_{k−1} u^{k−1}.
• Then u = 1 − f(1) + f(u); solving for u gives the solution.
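A numerical sketch of this targeted-attack calculation, assuming a Poisson degree distribution and the simple strategy φk = 1 for k < k_cut and φk = 0 otherwise (i.e. all nodes of degree ≥ k_cut are removed); function names and the cutoff are illustrative:

```python
import math

def poisson_pk(c, kmax=120):
    """Truncated Poisson degree distribution with mean c."""
    pk = [math.exp(-c)]
    for k in range(1, kmax):
        pk.append(pk[-1] * c / k)  # p_k = p_{k-1} * c / k
    return pk

def gc_size_attack(c, k_cut, tol=1e-12, max_iter=10000):
    """Giant component size when all nodes of degree >= k_cut are removed.
    Solves u = 1 - f(1) + f(u) with f(u) = sum_k phi_k q_{k-1} u^{k-1}."""
    pk = poisson_pk(c)
    phi = [1.0 if k < k_cut else 0.0 for k in range(len(pk))]
    # q_{k-1} = k p_k / <k>: probability a neighbour has degree k
    q = [k * pk[k] / c for k in range(len(pk))]
    f = lambda u: sum(phi[k] * q[k] * u ** (k - 1) for k in range(1, len(pk)))
    f1 = f(1.0)
    u = 0.0
    for _ in range(max_iter):
        u_next = 1.0 - f1 + f(u)
        if abs(u_next - u) < tol:
            break
        u = u_next
    # S = sum_k p_k phi_k (1 - u^k)
    return sum(pk[k] * phi[k] * (1.0 - u ** k) for k in range(len(pk)))

# Removing more hubs (lower k_cut) shrinks the giant component.
print(gc_size_attack(4.0, 20), gc_size_attack(4.0, 5))
```

With ⟨k⟩ = 4, keeping only degrees below 5 already pushes f′(1) below 1, so the giant component disappears even though most nodes survive.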
Failure and Attack
Epidemics
Disease spreading
• Standard models: Susceptible, Infected, Recovered.
• SIR: transmission rate β, recovery rate ν, infectious time τ = 1/ν.
• Infect neighbour with probability φ = 1 − e^{−βτ}.
• How far will it spread: the giant component.
Percolation
• I: Disease not transmitted (1 − φ).
• II: Disease transmitted (φ), but not to the GC (u^k).
• Already solved: critical φc = 1/g1′(1) = ⟨k⟩/(⟨k²⟩ − ⟨k⟩).
• Epidemiological threshold βτ = log[(⟨k²⟩ − ⟨k⟩)/(⟨k²⟩ − 2⟨k⟩)].
Epidemics
Epidemic threshold
• For ER, the threshold is βτ = log(⟨k⟩/(⟨k⟩ − 1)).
• For scale free, ⟨k²⟩ diverges: always an epidemic outbreak.
[Figure: epidemic size S as a function of φ, for ER and scale-free networks.]
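As a quick check of the algebra: for an ER network ⟨k²⟩ = ⟨k⟩² + ⟨k⟩, and the general threshold reduces to the ER form. A small sketch (names are illustrative):

```python
import math

def epidemic_threshold(k1, k2):
    """General threshold: beta*tau = log((<k^2> - <k>) / (<k^2> - 2<k>))."""
    return math.log((k2 - k1) / (k2 - 2 * k1))

def er_threshold(c):
    """ER special case: beta*tau = log(<k> / (<k> - 1))."""
    return math.log(c / (c - 1))

c = 4.0
k2 = c * c + c  # Poisson: <k^2> = <k>^2 + <k>
print(epidemic_threshold(c, k2), er_threshold(c))  # both equal log(4/3)
```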
Conclusions
Models
• Short paths & clustering: small world model.
• Scale free: preferential attachment.
• Many other mechanisms: e.g. triadic closure, homophily, etc.
• Focus on stylized features.
Analysis
• Scale-free networks are robust and spread fast, but vulnerable to attack.
• Generating functions greatly help the analysis.
• Compare the observed network to a random/model network. How does it deviate?
Questions?
 
Q4-Mod-1c-Quiz-Projectile-333344444.pptx
Q4-Mod-1c-Quiz-Projectile-333344444.pptxQ4-Mod-1c-Quiz-Projectile-333344444.pptx
Q4-Mod-1c-Quiz-Projectile-333344444.pptx
 
办理麦克马斯特大学毕业证成绩单|购买加拿大文凭证书
办理麦克马斯特大学毕业证成绩单|购买加拿大文凭证书办理麦克马斯特大学毕业证成绩单|购买加拿大文凭证书
办理麦克马斯特大学毕业证成绩单|购买加拿大文凭证书
 
Charateristics of the Angara-A5 spacecraft launched from the Vostochny Cosmod...
Charateristics of the Angara-A5 spacecraft launched from the Vostochny Cosmod...Charateristics of the Angara-A5 spacecraft launched from the Vostochny Cosmod...
Charateristics of the Angara-A5 spacecraft launched from the Vostochny Cosmod...
 
DNA isolation molecular biology practical.pptx
DNA isolation molecular biology practical.pptxDNA isolation molecular biology practical.pptx
DNA isolation molecular biology practical.pptx
 
Observational constraints on mergers creating magnetism in massive stars
Observational constraints on mergers creating magnetism in massive starsObservational constraints on mergers creating magnetism in massive stars
Observational constraints on mergers creating magnetism in massive stars
 
projectile motion, impulse and moment
projectile  motion, impulse  and  momentprojectile  motion, impulse  and  moment
projectile motion, impulse and moment
 
Immunoblott technique for protein detection.ppt
Immunoblott technique for protein detection.pptImmunoblott technique for protein detection.ppt
Immunoblott technique for protein detection.ppt
 
Fertilization: Sperm and the egg—collectively called the gametes—fuse togethe...
Fertilization: Sperm and the egg—collectively called the gametes—fuse togethe...Fertilization: Sperm and the egg—collectively called the gametes—fuse togethe...
Fertilization: Sperm and the egg—collectively called the gametes—fuse togethe...
 
Interferons.pptx.
Interferons.pptx.Interferons.pptx.
Interferons.pptx.
 
final waves properties grade 7 - third quarter
final waves properties grade 7 - third quarterfinal waves properties grade 7 - third quarter
final waves properties grade 7 - third quarter
 
GLYCOSIDES Classification Of GLYCOSIDES Chemical Tests Glycosides
GLYCOSIDES Classification Of GLYCOSIDES  Chemical Tests GlycosidesGLYCOSIDES Classification Of GLYCOSIDES  Chemical Tests Glycosides
GLYCOSIDES Classification Of GLYCOSIDES Chemical Tests Glycosides
 
Pests of Sunflower_Binomics_Identification_Dr.UPR
Pests of Sunflower_Binomics_Identification_Dr.UPRPests of Sunflower_Binomics_Identification_Dr.UPR
Pests of Sunflower_Binomics_Identification_Dr.UPR
 
GenAI talk for Young at Wageningen University & Research (WUR) March 2024
GenAI talk for Young at Wageningen University & Research (WUR) March 2024GenAI talk for Young at Wageningen University & Research (WUR) March 2024
GenAI talk for Young at Wageningen University & Research (WUR) March 2024
 
FBI Profiling - Forensic Psychology.pptx
FBI Profiling - Forensic Psychology.pptxFBI Profiling - Forensic Psychology.pptx
FBI Profiling - Forensic Psychology.pptx
 

Introduction to complex networks

  • 1. Introduction to Complex Networks V.A. Traag KITLV, Leiden, the Netherlands e-Humanities, KNAW, Amsterdam, the Netherlands March 30, 2014 Royal Netherlands Academy of Arts and Sciences Humanities
  • 2. Overview 1 What are networks? 2 Classics: scale free & small worlds. 3 Percolation: giant components, failure & attack and epidemics. Probability generating functions.
  • 3. Examples • Neural networks • Power grids • Gas networks • Internet router network • World Wide Web • Road networks • Airline networks • Call networks • Social networks • Social media networks
  • 9. Basics Network • Graph or network G = (V, E). • Nodes V = {1, . . . , n} (vertices): power station, webpage, intersection, person. • Edges E ⊆ V × V (links, ties): power cables, hyperlinks, roads, friendships. Can be directed, and possibly weighted. Essentials • Degree ki is the number of links at node i. • If |E| = m is the number of edges, then Σi ki = 2m. • Average degree ⟨k⟩ = 2m/n. • Density p = m/(n choose 2) = ⟨k⟩/(n − 1) ≈ ⟨k⟩/n. • Most networks are sparse: ⟨k⟩, p low.
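The basic quantities above can be sketched in a few lines of Python (the toy graph below is a hypothetical example, not from the slides):

```python
edges = {(1, 2), (1, 3), (2, 3), (3, 4)}  # a hypothetical toy graph
nodes = {v for e in edges for v in e}
n, m = len(nodes), len(edges)

degree = {v: sum(v in e for e in edges) for v in nodes}

avg_degree = 2 * m / n              # <k> = 2m / n
density = m / (n * (n - 1) / 2)     # p = m / C(n, 2)

print(avg_degree)  # 2.0  (the degrees sum to 2m = 8)
```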
  • 10. Picture worth . . . words Visualisations are essentially wrong, but sometimes insightful. Need to assess statistics to understand networks.
  • 11. Analysed properties Analysis strategy • Focus on some key (statistical) ingredients. • Only overall general properties, no particulars. • Compare to a random graph: what can we expect? • Modelling ⇒ replicate key properties. Some key properties • Degree distribution • Degree correlations • Path lengths • Clustering • Modularity • Dynamics: inter-event times
  • 12. Small world Milgram’s experiment (1960s) • Ask people to reach a specific person: John Doe, journalist, Kansas. • Send a letter to an acquaintance, who forwards it, and so on. • Result: about 5 intermediaries to reach the destination. • Six degrees of separation. Key question: is this different from a random graph?
  • 13. Erdős–Rényi (ER) graphs Random graph • Create an empty graph G with n nodes. • Every edge has probability p of appearing. • On average p(n choose 2) = m edges. • Average degree ⟨k⟩ = pn. • The random graph is essentially a (very simple) model. • Was (and still is) used frequently. Biology, epidemiology: well-mixed population. • Many interesting questions still.
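A minimal sketch of the ER model described above (the parameter values are arbitrary choices): every pair of nodes is linked independently with probability p, and the average degree comes out close to pn.

```python
import random
from itertools import combinations

random.seed(42)  # reproducible
n, p = 1000, 0.005
edges = [(i, j) for i, j in combinations(range(n), 2) if random.random() < p]

m = len(edges)
avg_degree = 2 * m / n
print(avg_degree)  # close to p * n = 5.0
```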
  • 14. Small world? Path length • Every node has ki ≈ ⟨k⟩, so in ℓ steps we reach about ⟨k⟩^ℓ nodes. • When ⟨k⟩^ℓ = n we have reached the whole network. • Hence ℓ ≈ log n / log⟨k⟩: grows slowly! Random edges create short paths. Clustering • Clustering: Ci = ei/(ki choose 2). • In an ER graph Ci = p³n(n − 1)/(p²n(n − 1)) = p. • Networks are sparse: low p, so low Ci. Real world: both short paths & clustering. How to get that?
  • 16. Watts & Strogatz Small world model • Create a lattice (connect to nearest neighbours). • Rewire (or add) each edge with probability p. Few shortcuts are enough to create short paths.
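The construction above can be sketched roughly as follows (an assumption-laden version: ring lattice plus random rewiring; details such as collision handling vary between implementations):

```python
import random

random.seed(1)
n, c, p_rewire = 20, 4, 0.1  # c nearest neighbours, c must be even

# ring lattice: each node linked to c/2 neighbours on either side
edges = set()
for i in range(n):
    for j in range(1, c // 2 + 1):
        edges.add(tuple(sorted((i, (i + j) % n))))

# rewire one endpoint of each edge with probability p_rewire
rewired = set()
for (i, j) in edges:
    if random.random() < p_rewire:
        new = random.randrange(n)
        if new != i and tuple(sorted((i, new))) not in rewired:
            j = new
    rewired.add(tuple(sorted((i, j))))

print(len(edges))  # the lattice has n * c / 2 = 40 edges
```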
  • 19. Degree distribution [figure: degree distribution on linear axes, peaked around ki ≈ ⟨k⟩] • In real networks, power law: p(k) ∼ k^(−α), usually 2 < α < 3. • In ER graphs, Poisson: p(k) = ⟨k⟩^k e^(−⟨k⟩)/k!.
  • 20. Degree distribution [figure: the same distribution on log–log axes, with hubs in the tail] • In real networks, power law: p(k) ∼ k^(−α), usually 2 < α < 3. • In ER graphs, Poisson: p(k) = ⟨k⟩^k e^(−⟨k⟩)/k!.
  • 21. Barabási & Albert How to get a power-law degree distribution? Preferential attachment, cumulative advantage Start with a graph of q nodes. 1 Add a node. 2 Add q links to previous nodes with probability pi ∼ ki. 3 Repeat (1)–(2). Results • Analysis by master rate equation: p(k) = ((k − 1)/2) p(k − 1) − (k/2) p(k). • Leads to p(k) = 2q(q + 1)/(k(k + 1)(k + 2)) ∼ k^(−3). • Preferential attachment ⇒ scale-free network.
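Preferential attachment can be sketched with the standard stub-list trick (a toy version with q = 2 and a small seed clique, not the authors' code): drawing a uniform endpoint from the list of all edge endpoints picks a node with probability proportional to its degree.

```python
import random

random.seed(7)
q = 2  # links added per new node

# seed: a small clique of q + 1 nodes
edges = [(i, j) for i in range(q + 1) for j in range(i + 1, q + 1)]
# 'stubs' lists every endpoint once per link, so a uniform draw from it
# picks a node with probability proportional to its degree
stubs = [v for e in edges for v in e]

for new in range(q + 1, 200):
    chosen = set()
    while len(chosen) < q:          # q distinct targets
        chosen.add(random.choice(stubs))
    for old in chosen:
        edges.append((old, new))
        stubs.extend((old, new))

degree = {}
for i, j in edges:
    degree[i] = degree.get(i, 0) + 1
    degree[j] = degree.get(j, 0) + 1

print(max(degree.values()))  # hubs: far above the average degree of ~2q
```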
  • 22. Scale free Scale free: so what? Why does it matter? • Scale-free networks are robust against random node failure. • Vulnerable to targeted attacks (take out the hubs). • No threshold for epidemic spreading. Approach: percolation & generating functions.
  • 23. Generating functions Definition (Generating function) Let Pr(S = k) = pk. Then g(x) = E(x^S) = Σk pk x^k is the probability generating function (pgf). Properties • Normalised: g(1) = E(1^S) = Σk pk = 1. • Mean: ⟨k⟩ = Σk k pk 1^(k−1) = g′(1); more generally, moments ⟨k^m⟩ = [(x ∂/∂x)^m g]|x=1. • Sum: for S = Σi Si over m independent draws from pk, the pgf is f(x) = E(x^(Σi Si)) = Πi E(x^Si) = g(x)^m.
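A quick numerical sanity check of the pgf properties (a sketch using a Poisson distribution, which matches the ER example that follows): g(1) = 1 and g′(1) = ⟨k⟩.

```python
from math import exp, factorial

mean_k = 3.0
# Poisson degree distribution p_k = <k>^k e^{-<k>} / k!, truncated at 100
p = [mean_k**k * exp(-mean_k) / factorial(k) for k in range(100)]

def g(x):
    return sum(pk * x**k for k, pk in enumerate(p))

norm = g(1.0)                       # should be 1 (normalisation)
h = 1e-6
mean = (g(1.0) - g(1.0 - h)) / h    # numerical g'(1), should be <k>
print(norm, mean)
```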
  • 34. Degree generating function Example, ER degree distribution • Let pk be the probability that a node has degree k. • Take pk = (n choose k) p^k (1 − p)^(n−k) (Erdős–Rényi). • Then the pgf is g(x) = Σk (n choose k) p^k (1 − p)^(n−k) x^k = Σk (n choose k) (xp)^k (1 − p)^(n−k) (binomial theorem) = (px + (1 − p))^n = (1 + p(x − 1))^n (remember ⟨k⟩ = pn) = (1 + ⟨k⟩(x − 1)/n)^n → e^(⟨k⟩(x−1)) (n → ∞, definition of exp). • Normalised: g(1) = e^(⟨k⟩(1−1)) = 1. • Mean: g′(x) = ⟨k⟩ e^(⟨k⟩(x−1)), so g′(1) = ⟨k⟩. • Number of neighbours of m nodes: g(x)^m = e^(m⟨k⟩(x−1)), with mean (g(x)^m)′|x=1 = m⟨k⟩.
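The limit used in this derivation can be checked numerically (a sketch; the values of ⟨k⟩ and x are arbitrary): the binomial pgf (1 + p(x − 1))^n approaches e^(⟨k⟩(x−1)) as n grows with ⟨k⟩ = pn fixed.

```python
from math import exp

mean_k, x = 4.0, 0.5
for n in (10, 100, 10000):
    p = mean_k / n
    binom_pgf = (1 + p * (x - 1)) ** n
    print(n, binom_pgf, exp(mean_k * (x - 1)))  # converges to e^{<k>(x-1)}
```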
  • 47. Excess degree distribution Earlier: pick a random node. Now: follow a random edge to a node. • Probability that the node has k links: k pk / Σk k pk = k pk / ⟨k⟩. • Average neighbour degree: Σk k · k pk/⟨k⟩ = ⟨k²⟩/⟨k⟩ = ⟨k⟩ + (⟨k²⟩ − ⟨k⟩²)/⟨k⟩ > ⟨k⟩ (friendship paradox). • Probability that the node has k other links (excess degree): qk = (k + 1) pk+1 / ⟨k⟩. • pgf: g1(x) = Σk qk x^k = (1/⟨k⟩) Σk (k + 1) pk+1 x^k = (1/⟨k⟩) Σk k pk x^(k−1) = g′(x)/⟨k⟩. • Second neighbours of a node of degree m: g1(x)^m = Σk p2(k|m) x^k. • Distribution of second neighbours: Σm pm Σk p2(k|m) x^k = Σm pm g1(x)^m = g(g1(x)). • Distribution of third neighbours: g(g1(g1(x))); of d-th neighbours: g(g1(· · · g1(x) · · · )). • Average number of second neighbours: ⟨k²⟩ − ⟨k⟩.
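The excess degree formulas can be verified on a small hand-made distribution (the distribution below is a hypothetical example, not from the slides):

```python
p = {1: 0.5, 2: 0.3, 5: 0.2}   # hypothetical degree distribution

mean_k = sum(k * pk for k, pk in p.items())
mean_k2 = sum(k * k * pk for k, pk in p.items())

# excess degree distribution q_k = (k + 1) p_{k+1} / <k>
q = {k - 1: k * pk / mean_k for k, pk in p.items()}

print(mean_k)               # <k> = 2.1
print(mean_k2 / mean_k)     # average neighbour degree, larger than <k>
print(mean_k2 - mean_k)     # average number of second neighbours
```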
  • 70. Giant component Giant component (GC) • There is always only one GC (and lots of small components). • Let u be the probability that a link does not connect a node to the GC. • Probability that a node of degree k is not in the GC: u^k. • Probability that a node is not in the GC: Σk pk u^k = g(u). • Size of the giant component: S = 1 − g(u). But what is u? Self-consistency • A link that does not lead to the GC connects to a node with k other neighbours (excess degree), none of which lead to the GC either. • Averaging over the excess degree: Σk qk u^k = g1(u) = u.
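The self-consistency condition can be solved by fixed-point iteration. A sketch for the ER case, where g(x) = g1(x) = e^(⟨k⟩(x−1)):

```python
from math import exp

def giant_component_size(mean_k, iters=1000):
    u = 0.5                           # start strictly below 1
    for _ in range(iters):
        u = exp(mean_k * (u - 1))     # u <- g1(u)
    return 1 - exp(mean_k * (u - 1))  # S = 1 - g(u)

print(giant_component_size(0.5))  # below threshold: S ~ 0
print(giant_component_size(2.0))  # above threshold: S ~ 0.797
```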
  • 78. Giant component How to solve g1(u) = u? [figure: g1(u) and the diagonal u = g1(u) plotted on [0, 1]; the solutions are the intersections]
  • 79. Giant component If the derivative g1′(1) > 1, a giant component appears. [figure: the slope of g1 at u = 1 determines whether a fixed point u < 1 exists]
  • 80. Giant component • GC appears when 1 < g1′(1) = Σk k qk = (1/⟨k⟩) Σk k(k + 1) pk+1 = (1/⟨k⟩) Σk k(k − 1) pk = (1/⟨k⟩) Σk (k² pk − k pk) = (⟨k²⟩ − ⟨k⟩)/⟨k⟩. • For ER graphs ⟨k²⟩ − ⟨k⟩ = ⟨k⟩², so for ⟨k⟩ > 1 the GC appears. [figure: S versus ⟨k⟩; S = 0 up to ⟨k⟩ = 1, then grows towards 1] • For scale-free graphs ⟨k²⟩ → ∞, so there is always a GC (if 2 < α < 3).
• 89. Node failure
How fast is the giant component destroyed if nodes are removed?
Same approach
• φ: probability that a node is not removed from the network (it does not "fail").
• Again u: probability that a link does not connect to the GC.
Self-consistency
• I: the neighbour is removed (1 − φ).
• II: the neighbour is not removed (φ), but is not in the GC (u^k).
• So the probability is 1 − φ + φ u^k.
• On average: Σ_k q_k (1 − φ + φ u^k) = 1 − φ + φ g1(u) = u.
• Solving for u gives the solution.
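The failure condition can again be iterated numerically. A sketch assuming an ER network (so g = g1) and the standard percolation result that the giant component then occupies a fraction S = φ(1 − g(u)) of all nodes; the function name is mine:

```python
import math

def perc_giant_component(k_mean, phi, tol=1e-12, max_iter=10_000):
    """Giant-component size under random node removal on an ER network.
    Solves u = 1 - phi + phi * g1(u) with g(u) = g1(u) = exp(k_mean*(u-1));
    the surviving GC fraction is S = phi * (1 - g(u))."""
    g1 = lambda u: math.exp(k_mean * (u - 1.0))
    u = 0.5
    for _ in range(max_iter):
        u_next = 1.0 - phi + phi * g1(u)
        if abs(u_next - u) < tol:
            break
        u = u_next
    return phi * (1.0 - g1(u))

# With <k> = 4 the threshold is phi_c = 1/<k> = 0.25:
print(perc_giant_component(4.0, 0.20))  # below phi_c: essentially 0
print(perc_giant_component(4.0, 0.50))  # above phi_c: positive S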
• 100. Node failure
• Again, solving u = 1 − φ + φ g1(u) is not easy.
• But if ∂/∂u [1 − φ + φ g1(u)] = φ g1'(u) > 1 at u = 1, a GC exists.
• That is, a GC exists for φ > 1/g1'(1) = φc.
• For ER φc = 1/⟨k⟩; for scale-free networks φc = 0.
[Figure: S versus φ for ER and scale-free networks; the ER curve vanishes at φc = 1/⟨k⟩, the scale-free curve only at φ = 0.]
• 107. Node attack
What if we attack specific nodes?
Same approach
• φ_k: probability that a node of degree k does not "fail".
• On average φ = Σ_k φ_k p_k.
• Again u: probability that a link does not connect to the GC.
Self-consistency
• I: the neighbour is removed (1 − φ_k).
• II: the neighbour is not removed (φ_k), but is not in the GC (u^(k−1)).
• So on average u = Σ_k q_{k−1} (1 − φ_k + φ_k u^(k−1)).
• Define f(u) = Σ_k φ_k q_{k−1} u^(k−1).
• Then u = 1 − f(1) + f(u); solving for u gives the solution.
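The degree-dependent condition u = 1 − f(1) + f(u) can be solved the same way. A sketch under stated assumptions: Poisson degrees, the excess-degree weight written as q_{k−1} = k p_k / ⟨k⟩, and a hypothetical attack rule φ_k that removes every node of degree above 5; function and variable names are mine:

```python
import math

def attack_u(pk, phik, tol=1e-12, max_iter=100_000):
    """Solve u = 1 - f(1) + f(u) with f(u) = sum_k phik[k] * q_{k-1} * u^(k-1),
    where q_{k-1} = k * pk[k] / <k> is the excess-degree weight of degree k."""
    k_mean = sum(k * p for k, p in enumerate(pk))
    weights = [phik[k] * k * pk[k] / k_mean for k in range(len(pk))]
    f = lambda u: sum(w * u**(k - 1) for k, w in enumerate(weights) if k >= 1)
    f1 = f(1.0)
    u = 0.5
    for _ in range(max_iter):
        u_next = 1.0 - f1 + f(u)
        if abs(u_next - u) < tol:
            break
        u = u_next
    return u

# Hypothetical attack: remove every node of degree > 5 in a Poisson(<k> = 3) network.
k_max = 60
pk = [math.exp(-3.0) * 3.0**k / math.factorial(k) for k in range(k_max + 1)]
phik = [1.0 if k <= 5 else 0.0 for k in range(k_max + 1)]
print(attack_u(pk, phik))  # 0 < u < 1: a GC survives this particular attack
```

With φ_k = 1 for all k this reduces to the plain giant-component equation u = g1(u).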
• 115. Epidemics
Disease spreading
• Standard compartments: Susceptible, Infected, Recovered.
• SIR: transmission rate β, recovery rate ν, infectious time τ = 1/ν.
• Infect a neighbour with probability φ = 1 − e^(−βτ).
• How far will it spread: the giant component.
Percolation
• I: the disease is not transmitted (1 − φ).
• II: the disease is transmitted (φ), but not to the GC (u^k).
• Already solved: critical φc = 1/g1'(1) = ⟨k⟩/(⟨k²⟩ − ⟨k⟩).
• Epidemiological threshold: βτ = log((⟨k²⟩ − ⟨k⟩)/(⟨k²⟩ − 2⟨k⟩)).
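The threshold follows by setting φ = 1 − e^(−βτ) equal to φc = ⟨k⟩/(⟨k²⟩ − ⟨k⟩) and solving for βτ. A short sketch (function name is mine; it assumes ⟨k²⟩ > 2⟨k⟩, otherwise the threshold diverges and no finite βτ triggers an epidemic):

```python
import math

def epidemic_threshold(k_mean, k2_mean):
    """Critical beta*tau from phi_c = <k> / (<k^2> - <k>) and
    phi = 1 - exp(-beta*tau):
    beta*tau_c = log((<k^2> - <k>) / (<k^2> - 2<k>))."""
    return math.log((k2_mean - k_mean) / (k2_mean - 2.0 * k_mean))

# ER check: <k^2> = <k> + <k>^2, so the threshold reduces to log(<k>/(<k> - 1)).
k = 2.0
print(epidemic_threshold(k, k + k * k))  # ER with <k> = 2
print(math.log(k / (k - 1.0)))           # same value, log(2)
```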
• 121. Epidemics
Epidemic threshold
• For ER, the threshold is βτ = log(⟨k⟩/(⟨k⟩ − 1)).
• For scale-free networks ⟨k²⟩ diverges: there is always an epidemic outbreak.
[Figure: outbreak size S versus φ for ER and scale-free networks.]
• 122. Conclusions
Models
• Short paths & clustering: small-world model.
• Scale-free: preferential attachment.
• Many other mechanisms: e.g. triadic closure, homophily, etc.
• Focus on stylized features.
Analysis
• Scale-free networks are robust and spread fast, but are vulnerable to attack.
• Generating functions greatly help the analysis.
• Compare the observed network to a random/model network. How does it deviate?
Questions?