Notes on Popularity versus Similarity Network Model
Peiyuan Sun
April 13, 2017
Model 0. PA with initial attractiveness
• reference
Structure of Growing Networks with Preferential Linking
S. N. Dorogovtsev, J. F. F. Mendes, and A. N. Samukhin
Physical Review Letters, 2000
• generative process:
1. at equally spaced discrete time steps, a new site appears;
2. simultaneously, m new directed links coming out from nonspecified sites are introduced;
3. the m new links are distributed among the nodes with probability proportional to the attractiveness of node s: A_s = A + q_s, where A is each node's initial attractiveness and q_s is the incoming degree of node s (a minimal simulation sketch follows this list).
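The following minimal sketch (my own illustration, not code from the paper; `run_pa`, the parameter values, and the choice to let the newborn node compete for the m links are all assumptions) simulates this generative process and prints the empirical in-degree distribution, whose tail should decay roughly as q^(-(2 + A/m)).

```python
# Hedged simulation sketch of Model 0: preferential attachment with initial
# attractiveness A (illustrative names and parameters, not the paper's code).
import random
from collections import Counter

def run_pa(T=20000, m=2, A=2.0, seed=0):
    """Grow a network for T steps; each step adds one node and m directed links
    whose targets are drawn with probability proportional to A + in-degree."""
    rng = random.Random(seed)
    indeg = [0]                          # in-degree q_s of every existing node
    for _ in range(1, T):
        indeg.append(0)                  # the new node appears with q = 0
        weights = [A + q for q in indeg] # attractiveness A_s = A + q_s
        for s in rng.choices(range(len(indeg)), weights=weights, k=m):
            indeg[s] += 1                # distribute the m new links
    return indeg

if __name__ == "__main__":
    indeg = run_pa()
    counts = Counter(indeg)
    for q in (1, 2, 4, 8, 16, 32):       # empirical P(q); tail ~ q^{-(2 + A/m)}
        print(q, counts.get(q, 0) / len(indeg))
```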
• Master Equation
Throughout, a = A/m, so the probability that one of the m new links attaches to node s is (q_s + am)/((1+a)mt): the attractiveness A + q_s of node s divided by the total attractiveness of the network at time t. Writing P_s^{(m,l)} for the probability that node s receives exactly l of the m new links,
P(q, s, t+1) = \sum_{l=0}^{m} P_s^{(m,l)}\, P(q-l, s, t)
             = \sum_{l=0}^{m} \binom{m}{l} \left[\frac{q-l+am}{(1+a)mt}\right]^{l} \left[1 - \frac{q-l+am}{(1+a)mt}\right]^{m-l} P(q-l, s, t)     (1)
P(q, s, s) = δ(q) (2)
connectivity distribution of the entire network:
P(q, t) = \frac{1}{t}\sum_{s=1}^{t} P(q, s, t)     (3)
summing Eq. 1 over s from 1 to t:
\sum_{s=1}^{t} P(q, s, t+1) = \sum_{s=1}^{t}\sum_{l=0}^{m} P_s^{(m,l)}\, P(q-l, s, t)
= \sum_{s=1}^{t}\sum_{l=0}^{m} \binom{m}{l} \left[\frac{q-l+am}{(1+a)mt}\right]^{l} \left[1 - \frac{q-l+am}{(1+a)mt}\right]^{m-l} P(q-l, s, t)
= \sum_{l=0}^{m} \binom{m}{l} \left[\frac{q-l+am}{(1+a)mt}\right]^{l} \left[1 - \frac{q-l+am}{(1+a)mt}\right]^{m-l} \sum_{s=1}^{t} P(q-l, s, t)
= \binom{m}{0}\left[1 - \frac{q+am}{(1+a)mt}\right]^{m} t\,P(q, t) + \binom{m}{1}\frac{q-1+am}{(1+a)mt}\left[1 - \frac{q-1+am}{(1+a)mt}\right]^{m-1} t\,P(q-1, t) + O\!\left(\frac{P}{t}\right)
= t\left[1 - \frac{q+am}{(1+a)mt}\right]^{m} P(q, t) + mt\,\frac{q-1+am}{(1+a)mt}\left[1 - \frac{q-1+am}{(1+a)mt}\right]^{m-1} P(q-1, t) + O\!\left(\frac{P}{t}\right)
= \left[t - \frac{q+am}{1+a}\right] P(q, t) + \frac{q-1+am}{1+a}\,P(q-1, t) + O\!\left(\frac{P}{t}\right)     (4)
(Eq. 3 was used to evaluate \sum_{s} P(q-l, s, t) = t\,P(q-l, t); the terms with l ≥ 2 contribute only at order O(P/t).)
\sum_{s=1}^{t} P(q, s, t+1) = \sum_{s=1}^{t+1} P(q, s, t+1) - P(q, t+1, t+1) = (t+1)\,P(q, t+1) - P(q, t+1, t+1)     (5)
(t+1)\,P(q, t+1) - P(q, t+1, t+1) = \left[t - \frac{q+am}{1+a}\right] P(q, t) + \frac{q-1+am}{1+a}\,P(q-1, t) + O\!\left(\frac{P}{t}\right)     (6)
using the initial condition P(q, t+1, t+1) = δ(q) of Eq. 2 and multiplying through by (1+a):
(1+a)\,t\,\bigl(P(q, t+1) - P(q, t)\bigr) + (1+a)\,P(q, t+1) + (q+am)\,P(q, t) - (q-1+am)\,P(q-1, t) = (1+a)\,\delta(q)     (7)
at long times t ≫ 1:
(1+a)\,t\,\frac{\partial P(q,t)}{\partial t} + (1+a)\,P(q,t) + (q+am)\,P(q,t) - (q-1+am)\,P(q-1,t) = (1+a)\,\delta(q)     (8)
assume the limit P(q) = P(q, t → ∞) exists:
(1 + a)P(q) + (q + am)P(q) − (q − 1 + am)P(q − 1) = (1 + a)δ(q) (9)
assume the generating function:
\Phi(z) = \sum_{q=0}^{\infty} P(q)\,z^{q}     (10)
we first derive some elementary transforms for the generating function:
Z[P(q)] = \sum_{q=0}^{\infty} P(q)\,z^{q} = \Phi(z)
Z[q\,P(q)] = \sum_{q=0}^{\infty} q\,P(q)\,z^{q} = z\sum_{q=0}^{\infty} q\,P(q)\,z^{q-1} = z\,\frac{d\Phi}{dz}
Z[q\,P(q-1)] = \sum_{q=1}^{\infty} q\,P(q-1)\,z^{q} = z^{2}\sum_{q=1}^{\infty}(q-1)\,P(q-1)\,z^{q-2} + z\sum_{q=1}^{\infty} P(q-1)\,z^{q-1}
             = z^{2}\sum_{q'=0}^{\infty} q'\,P(q')\,z^{q'-1} + z\sum_{q'=0}^{\infty} P(q')\,z^{q'} = z^{2}\,\frac{d\Phi}{dz} + z\,\Phi
Z[P(q-1)] = z\sum_{q=1}^{\infty} P(q-1)\,z^{q-1} = z\sum_{q'=0}^{\infty} P(q')\,z^{q'} = z\,\Phi     (11)
then Eq. 9 transforms to the following differential equation:
(1+a)\,\Phi + z\,\frac{d\Phi}{dz} + ma\,\Phi - z^{2}\,\frac{d\Phi}{dz} - z\,\Phi - (ma-1)\,z\,\Phi = 1+a     (12)
after some simple algebra operations:
z(1-z)\,\frac{d\Phi}{dz} + ma(1-z)\,\Phi + (1+a)\,\Phi = 1+a     (13)
this is a first-order nonhomogeneous linear differential equation of the standard form:
\frac{dy}{dx} + P(x)\,y = Q(x)     (14)
whose general solution is:
y = C\,e^{-\int P(x)\,dx} + e^{-\int P(x)\,dx}\int Q(x)\,e^{\int P(x)\,dx}\,dx     (15)
comparing Eq. 13 (divided by z(1 − z)) with Eq. 14:
P(z) = \frac{ma(1-z) + 1 + a}{z(1-z)}, \qquad Q(z) = \frac{1+a}{z(1-z)}     (16)
do some simple integration operations:
\int P(z)\,dz = \left[(m+1)a + 1\right]\ln z - (1+a)\ln(1-z)
\int Q(z)\,e^{\int P(z)\,dz}\,dz = (1+a)\int_{0}^{z}\frac{x^{(m+1)a}}{(1-x)^{a+2}}\,dx
e^{-\int P(z)\,dz} = \frac{(1-z)^{1+a}}{z^{(m+1)a+1}}     (17)
then we get the general formula for Φ:
\Phi = C\,z^{-1-(m+1)a}(1-z)^{1+a} + (1+a)\,z^{-1-(m+1)a}(1-z)^{1+a}\int_{0}^{z}\frac{x^{(m+1)a}}{(1-x)^{2+a}}\,dx     (18)
since Φ(z) must remain finite at z = 0 (Φ(0) = P(0)) while z^{-1-(m+1)a} diverges there, the constant term C = 0; then:
\Phi = (1+a)\,z^{-1-(m+1)a}(1-z)^{1+a}\int_{0}^{z}\frac{x^{(m+1)a}}{(1-x)^{2+a}}\,dx = \frac{1+a}{1+(m+1)a}\;{}_{2}F_{1}\!\left[1,\,ma;\,2+(m+1)a;\,z\right]     (19)
expanding the hypergeometric function in powers of z and comparing coefficients with the generating function:
P(q) = (1+a)\,\frac{\Gamma[(m+1)a+1]}{\Gamma(ma)}\,\frac{\Gamma(q+ma)}{\Gamma[q+2+(m+1)a]}     (20)
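As a sanity check of Eqs. 19 and 20, the sketch below (my own check, not from the paper; the values of m and a are arbitrary) sums the coefficients P(q) of Eq. 20 numerically and compares Σ_q P(q) z^q with the closed hypergeometric form of Eq. 19.

```python
# Hedged numerical check: P(q) from Eq. 20 should be normalized and should
# reproduce the generating function of Eq. 19.
import numpy as np
from scipy.special import hyp2f1, gammaln

def P(q, m, a):
    # Eq. 20: P(q) = (1+a) Γ[(m+1)a+1]/Γ(ma) · Γ(q+ma)/Γ[q+2+(m+1)a]
    logp = (np.log(1 + a) + gammaln((m + 1) * a + 1) - gammaln(m * a)
            + gammaln(q + m * a) - gammaln(q + 2 + (m + 1) * a))
    return np.exp(logp)

m, a = 2, 1.5                       # illustrative values (a = A/m)
q = np.arange(0, 20000)
pq = P(q, m, a)
print("normalization:", pq.sum())   # should be close to 1
for z in (0.2, 0.5, 0.8):
    lhs = (pq * z ** q).sum()                                                  # Eq. 10 with Eq. 20
    rhs = (1 + a) / (1 + (m + 1) * a) * hyp2f1(1, m * a, 2 + (m + 1) * a, z)   # Eq. 19
    print(z, lhs, rhs)              # the two columns should agree
```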
Eq. 20 is the first main result: the analytic solution for the degree distribution of the growing network. We then discuss two special cases:
1. when a = 1, which corresponds to the case A_s = m + q_s:
P(q) = \frac{2m(m+1)}{(q+m)(q+m+1)(q+m+2)}     (21)
2. when ma + q ≫ 1:
P(q) \approx (1+a)\,\frac{\Gamma[(m+1)a+1]}{\Gamma(ma)}\,(q+ma)^{-(2+a)} = (1+a)\,\frac{\Gamma[(m+1)a+1]}{\Gamma(ma)}\,(q+ma)^{-\gamma}     (22)
with γ = 2 + a = 2 + A/m.
then let us derive the distribution P(q, s, t); for t ≫ 1 we expand the Master Equation, keeping terms up to order 1/t (the remainder is O(p/t²)):
P(q, s, t+1) = \left[1 - \frac{q+am}{(1+a)t}\right] P(q, s, t) + \frac{q-1+am}{(1+a)t}\,P(q-1, s, t) + O\!\left(\frac{p}{t^{2}}\right)     (23)
(1+a)\,t\,P(q, s, t+1) = \left[(1+a)t - (q+am)\right] P(q, s, t) + (q-1+am)\,P(q-1, s, t) + O\!\left(\frac{p}{t}\right)     (24)
(1+a)\,t\,P(q, s, t+1) - (1+a)\,t\,P(q, s, t) = (q-1+am)\,P(q-1, s, t) - (q+am)\,P(q, s, t)     (25)
replacing the difference with a derivative for large t:
(1+a)\,t\,\frac{\partial P(q, s, t)}{\partial t} = (q-1+am)\,P(q-1, s, t) - (q+am)\,P(q, s, t)     (26)
define the generating function G(x, t) (for fixed s) by:
G(x, t) = \sum_{q=0}^{+\infty} P(q, s, t)\,x^{q}     (27)
then we begin by taking the time derivative:
\frac{\partial G(x,t)}{\partial t} = \sum_{q=0}^{+\infty}\frac{\partial P(q,s,t)}{\partial t}\,x^{q}
= \sum_{q=0}^{+\infty}\left[\frac{q}{(1+a)t}\,P(q-1,s,t) + \frac{am-1}{(1+a)t}\,P(q-1,s,t) - \frac{q}{(1+a)t}\,P(q,s,t) - \frac{am}{(1+a)t}\,P(q,s,t)\right] x^{q}
= \frac{1}{(1+a)t}\sum_{q=1}^{+\infty} q\,P(q-1,s,t)\,x^{q} + \frac{am-1}{(1+a)t}\sum_{q=1}^{+\infty} P(q-1,s,t)\,x^{q} - \frac{1}{(1+a)t}\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q} - \frac{am}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q}
= \frac{1}{(1+a)t}\sum_{q=1}^{+\infty}(q-1)\,P(q-1,s,t)\,x^{q} + \frac{1}{(1+a)t}\sum_{q=1}^{+\infty} P(q-1,s,t)\,x^{q} + \frac{(am-1)x}{(1+a)t}\sum_{q=1}^{+\infty} P(q-1,s,t)\,x^{q-1} - \frac{x}{(1+a)t}\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q-1} - \frac{am}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q}
= \frac{1}{(1+a)t}\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q+1} + \frac{1}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q+1} + \frac{(am-1)x}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q} - \frac{x}{(1+a)t}\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q-1} - \frac{am}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q}
= \frac{x^{2}}{(1+a)t}\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q-1} + \frac{x}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q} + \frac{(am-1)x}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q} - \frac{x}{(1+a)t}\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q-1} - \frac{am}{(1+a)t}\sum_{q=0}^{+\infty} P(q,s,t)\,x^{q}
= \frac{x^{2}}{(1+a)t}\,\frac{\partial G(x,t)}{\partial x} + \frac{x}{(1+a)t}\,G(x,t) + \frac{(am-1)x}{(1+a)t}\,G(x,t) - \frac{x}{(1+a)t}\,\frac{\partial G(x,t)}{\partial x} - \frac{am}{(1+a)t}\,G(x,t)
= \frac{x(x-1)}{(1+a)t}\,\frac{\partial G(x,t)}{\partial x} + \frac{am(x-1)}{(1+a)t}\,G(x,t)     (28)
then we get the partial differential equation for the generating function G(x, t):
\frac{\partial G(x,t)}{\partial t} = \frac{x(x-1)}{(1+a)t}\,\frac{\partial G(x,t)}{\partial x} + \frac{am(x-1)}{(1+a)t}\,G(x,t)     (29)
with initial condition:
G(x, s) = \sum_{q=0}^{+\infty} P(q, s, s)\,x^{q} = \sum_{q=0}^{+\infty} \delta(q)\,x^{q} = 1     (30)
To get the analytic solution of Eq. 29, we resort to the method of characteristics. The characteristic equations are:
\frac{dx}{-\frac{x(x-1)}{(1+a)t}} = \frac{dt}{1} = \frac{dG}{\frac{am(x-1)}{(1+a)t}\,G}     (31)
which is equivalent to:
\frac{dx}{-x(x-1)} = \frac{dt}{(1+a)t} = \frac{dG}{am(x-1)\,G}     (32)
we consider the following two pairs:
\frac{dx}{-x(x-1)} = \frac{dt}{(1+a)t}     (33)
\frac{dx}{-x(x-1)} = \frac{dG}{am(x-1)\,G}     (34)
integrating both sides of Eq. 33 and Eq. 34:
\frac{x}{x-1}\,t^{-\frac{1}{1+a}} = C_{1}     (35)
G(x,t)\,x^{am} = C_{2}     (36)
we can view C_2 as a function of C_1; since C_{1} = \frac{x}{x-1}\,t^{-\frac{1}{1+a}}, the general solution of the differential equation can be written as:
G(x,t) = \left[\frac{1}{f\!\left(\frac{x}{x-1}\,t^{-\frac{1}{1+a}}\right)}\cdot\frac{1}{x}\right]^{am}     (37)
plugging in the initial condition at t = s:
f\!\left(\frac{x}{x-1}\,s^{-\frac{1}{1+a}}\right) = \frac{1}{x}     (38)
replacing \frac{x}{x-1}\,s^{-\frac{1}{1+a}} with r (so that x = \frac{r}{r - s^{-1/(1+a)}}):
f(r) = \frac{r - s^{-\frac{1}{1+a}}}{r}     (39)
inserting f\!\left(\frac{x}{x-1}\,t^{-\frac{1}{1+a}}\right) into Eq. 37, we get the analytic solution of the partial differential equation Eq. 29:
G(x,t) = \left[\left(1 - \left(\frac{s}{t}\right)^{-\frac{1}{1+a}}\right)x + \left(\frac{s}{t}\right)^{-\frac{1}{1+a}}\right]^{-am}     (40)
to get the analytic result for P(q, s, t), we Taylor-expand Eq. 40. The coefficient of x^{q} is:
\frac{1}{q!}\left.\frac{\partial^{q} G(x,t)}{\partial x^{q}}\right|_{x=0}
= \frac{(-am)(-am-1)\cdots(-am-q+1)}{q!}\left[1 - \left(\frac{s}{t}\right)^{-\frac{1}{1+a}}\right]^{q}\left(\frac{s}{t}\right)^{\frac{am+q}{1+a}}
= \frac{(-1)^{q}\,am(am+1)\cdots(am+q-1)}{q!}\,(-1)^{q}\left[1 - \left(\frac{s}{t}\right)^{\frac{1}{1+a}}\right]^{q}\left(\frac{s}{t}\right)^{\frac{am}{1+a}}
= \frac{am(am+1)\cdots(am+q-1)}{q!}\left[1 - \left(\frac{s}{t}\right)^{\frac{1}{1+a}}\right]^{q}\left(\frac{s}{t}\right)^{\frac{am}{1+a}}
= \frac{\Gamma(am+q)}{\Gamma(am)\,q!}\left[1 - \left(\frac{s}{t}\right)^{\frac{1}{1+a}}\right]^{q}\left(\frac{s}{t}\right)^{\frac{am}{1+a}}     (41)
then:
P(q,s,t) = \frac{\Gamma(am+q)}{\Gamma(am)\,q!}\left[1 - \left(\frac{s}{t}\right)^{\frac{1}{1+a}}\right]^{q}\left(\frac{s}{t}\right)^{\frac{am}{1+a}}     (42)
the average connectivity of a given site is:
\left.\frac{\partial G(x,t)}{\partial x}\right|_{x=1} = \left.\sum_{q=0}^{+\infty} q\,P(q,s,t)\,x^{q-1}\right|_{x=1} = am\left[\left(\frac{s}{t}\right)^{-\frac{1}{1+a}} - 1\right]     (43)
\bar{q}(s,t) = am\left[\left(\frac{s}{t}\right)^{-\frac{1}{1+a}} - 1\right]     (44)
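A small consistency check (illustrative, with arbitrary parameter values): the mean of the distribution P(q, s, t) of Eq. 42 should reproduce the average connectivity q̄(s, t) of Eq. 44.

```python
# Hedged check: sum q*P(q,s,t) from Eq. 42 and compare with Eq. 44.
from math import exp, lgamma, log

def pq(q, s, t, a, m):
    am = a * m
    u = (s / t) ** (1.0 / (1.0 + a))           # (s/t)^{1/(1+a)}
    return exp(lgamma(am + q) - lgamma(am) - lgamma(q + 1)
               + q * log(1.0 - u) + am * log(u))

a, m, s, t = 0.5, 2, 100, 10000
mean_numeric = sum(q * pq(q, s, t, a, m) for q in range(5000))
mean_formula = a * m * ((s / t) ** (-1.0 / (1.0 + a)) - 1.0)   # Eq. 44
print(mean_numeric, mean_formula)               # the two should agree
```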
letting β = 1/(1 + a) and comparing with γ = 2 + a, we get the relation between a single site's degree growth and the degree exponent of the whole network:
β(γ − 1) = 1     (45)
Model 1. Popularity versus Similarity (m closest nodes)
• reference
Popularity versus Similarity in growing networks
Fragkiskos Papadopoulos, Maksim Kitsak, M. Ángeles Serrano, Marián Boguñá & Dmitri Krioukov
Nature, 2012
• generative process:
1. initially the network is empty;
2. at time t ≥ 1, new node t appears having coordinates (rt, θt), where rt = lnt,
while θt is uniformly distributed on [0, 2π], and every existing node s, s < t,
moves increasing its radial coordinate according to rs(t) = βrs + (1 − β)rt with
parameter β ∈ [0, 1];
3. node t connects to the m hyperbolically closest nodes s, s < t; at early times t ≤ m, node t connects to all the existing nodes (a minimal simulation sketch follows this list). The hyperbolic distance between two points (r_s, θ_s) and (r_t, θ_t) is given by:
x_{st} = \frac{1}{2}\,\mathrm{arccosh}\!\left(\cosh 2r_{s}\cosh 2r_{t} - \sinh 2r_{s}\sinh 2r_{t}\cos\theta_{st}\right) \approx r_{s} + r_{t} + \ln(\theta_{st}/2),
where θ_{st} = π − |π − |θ_{s} − θ_{t}||     (46)
• Degree Distribution
we first derive the probability that node s attracts a link from the m new links introduced by node t:
1. The generative process assumes that each new node connects to the m hyperbolically closest existing nodes. The only requirement that needs to be met is that the hyperbolic distance x_{st} is smaller than the largest hyperbolic distance between node t and its m closest neighbours. We can picture this region as a disc centred at node t. Then:
(a) the radius R_t of this disc must satisfy the following condition:
\bar{N}(R_t) = \int_{1}^{t} p(s,t)\,ds = m     (47)
(b) the connection probability between nodes s and t (the angular distance θ_{st} is uniform on [0, π]):
p(s,t) = p(x_{st} < R_t) = p\!\left(r_s(t) + r_t + \ln\frac{\theta_{st}}{2} < R_t\right) = p\!\left(\theta_{st} < 2e^{-(r_s(t)+r_t-R_t)}\right) = \frac{2}{\pi}\,e^{-(r_s(t)+r_t-R_t)}     (48)
(c) plug this probability into Eq. 47:
\bar{N}(R_t) = \int_{1}^{t}\frac{2}{\pi}\,e^{-(r_s(t)+r_t-R_t)}\,ds = \frac{2}{\pi}\,e^{-(r_t-R_t)}\,\frac{1 - e^{-(1-\beta)r_t}}{1-\beta} = m     (49)
and solve for R_t:
R_t = r_t - \ln\frac{2\left(1 - e^{-(1-\beta)r_t}\right)}{\pi m(1-\beta)}     (50)
(d) then the probability that node s attracts a link from node t is:
p(s,t) = p(x_{st} < R_t) = m\,\frac{e^{-r_s(t)}}{\frac{1}{1-\beta}\left(1 - e^{-(1-\beta)r_t}\right)} = m\,\frac{e^{-r_s(t)}}{\int_{1}^{t} e^{-r_i(t)}\,di} = m\,\frac{(s/t)^{-\beta}}{\int_{1}^{t}(i/t)^{-\beta}\,di}     (51)
in steps (c) and (d), the author uses a very important transformation:
\int_{1}^{t} e^{-r_s(t)}\,ds = \int_{1}^{t} e^{-\beta r_s - (1-\beta)r_t}\,ds = e^{-(1-\beta)r_t}\int_{1}^{t} e^{\ln s^{-\beta}}\,ds = e^{-(1-\beta)r_t}\int_{1}^{t} s^{-\beta}\,ds = \frac{1}{1-\beta}\left(1 - e^{-(1-\beta)r_t}\right)     (52)
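A quick numerical check (illustrative parameter values, not from the paper): with R_t from Eq. 50 the expected number of nodes inside the disc, Eq. 47 with the probability of Eq. 48, should come out close to m, and the radial integral of Eq. 52 should match its closed form.

```python
# Hedged numerical check of Eqs. 47-52 (illustrative values).
import math
from scipy.integrate import quad

m, beta, t = 3, 0.6, 5000.0
rt = math.log(t)
Rt = rt - math.log(2.0 * (1.0 - math.exp(-(1.0 - beta) * rt))
                   / (math.pi * m * (1.0 - beta)))               # Eq. 50

def p_disc(s):
    rs_t = beta * math.log(s) + (1.0 - beta) * rt                 # r_s(t)
    return (2.0 / math.pi) * math.exp(-(rs_t + rt - Rt))          # Eq. 48

N_bar, _ = quad(p_disc, 1.0, t)                                   # Eq. 47
print("expected neighbours:", N_bar, "target m:", m)

lhs, _ = quad(lambda s: math.exp(-(beta * math.log(s) + (1 - beta) * rt)), 1.0, t)
rhs = (1.0 - math.exp(-(1.0 - beta) * rt)) / (1.0 - beta)         # Eq. 52
print("radial integral:", lhs, rhs)
```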
2. from Model 0, we can deduce that the probability that an existing node s attracts a link from the new node t is:
p(s,t) = m\,\frac{q(s,t) + A}{(m+A)t}     (53)
and the average degree of a single site is:
q(s,t) = A\left[\left(\frac{s}{t}\right)^{-\beta} - 1\right]     (54)
the author makes the further assumption that the total attractiveness of the network can be written as:
(m+A)t = \int_{1}^{t}\left(q(s,t) + A\right)ds     (55)
then Eq. 53 can be transformed into:
p(s,t) = m\,\frac{(s/t)^{-\beta}}{\int_{1}^{t}(i/t)^{-\beta}\,di}     (56)
which is equivalent to Model 0's result.
It is worth noting that β in Model 0 comes from the expected degree of a single node and equals 1/(1 + A/m), and it has been shown that the network's degree exponent and the single node's degree growth follow the rule β(γ − 1) = 1. Recall that this relation was derived from the probability that an existing node s attracts a link from the m new links introduced by node t. For the PSO model the basic master equation still holds:
\sum_{s=1}^{t} P(q, s, t+1) = \sum_{s=1}^{t}\sum_{l=0}^{m} P_s^{(m,l)}\,P(q-l, s, t)     (57)
So the network's degree distribution is the same as in Model 0.
Model 2. Popularity versus Similarity (select m nodes at random according to the Fermi-Dirac connection probability)
• reference
Popularity versus Similarity in growing networks
Fragkiskos Papadopoulos, Maksim Kitsak, M. Ángeles Serrano, Marián Boguñá & Dmitri Krioukov
Nature, 2012
• generative process:
1. initially the network is empty;
2. at time t ≥ 1, new node t appears having coordinates (rt, θt), where rt = lnt,
while θt is uniformly distributed on [0, 2π], and every existing node s, s < t,
moves increasing its radial coordinate according to rs(t) = βrs + (1 − β)rt with
parameter β ∈ [0, 1];
3. node t picks a randomly chosen existing node s, s < t, and connects to it with probability p(x_{st}) = \frac{1}{1 + e^{(x_{st}-R_t)/T}}; this procedure is repeated until node t has acquired m links (a minimal sketch of this linking step follows this list).
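A minimal sketch of this linking step (my own illustration; the function name, the value chosen for R_t and all parameters are assumptions, and the approximate distance of Eq. 46 is used):

```python
# Hedged sketch of Model 2's linking step: keep picking random existing nodes
# and accept each with the Fermi-Dirac probability until m links are collected.
import math
import random

def fermi_dirac_links(t, m, beta, T, r_birth, theta, Rt, rng):
    rt = math.log(t)
    links = set()
    while len(links) < min(m, t - 1):
        s = rng.randrange(1, t)                          # random existing node
        rs_t = beta * r_birth[s] + (1.0 - beta) * rt     # drifted radius r_s(t)
        dth = math.pi - abs(math.pi - abs(theta[s] - theta[t]))
        x_st = rs_t + rt + math.log(max(dth, 1e-12) / 2.0)        # Eq. 46 (approx.)
        if s not in links and rng.random() < 1.0 / (1.0 + math.exp((x_st - Rt) / T)):
            links.add(s)
    return links

if __name__ == "__main__":
    rng = random.Random(1)
    t, m, beta, T = 50, 2, 0.6, 0.3
    r_birth = {s: math.log(s) for s in range(1, t + 1)}
    theta = {s: rng.uniform(0.0, 2.0 * math.pi) for s in range(1, t + 1)}
    Rt = math.log(t)      # illustrative choice only; the model tunes R_t (cf. Eq. 50)
    print(fermi_dirac_links(t, m, beta, T, r_birth, theta, Rt, rng))
```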
• Degree Distribution
we again derive the probability that node s attracts a link from the m new links introduced by node t:
1. the probability that node t selects a node s, s < t, at random and connects to it (writing X(s,t) = e^{r_s(t)+r_t-R_t}, so that the Fermi-Dirac factor becomes 1/(1 + (X(s,t)\,\theta_{st}/2)^{1/T})):
p(s,t) = \frac{1}{t}\,\frac{1}{\pi}\int_{0}^{\pi}\frac{1}{1 + \left(X(s,t)\,\theta_{st}/2\right)^{1/T}}\,d\theta_{st}     (58)
I think the term 1/t stands for selecting a node uniformly at random from [1, ..., t], since the model assumes that nodes appear uniformly over the discrete time steps. The integral is the connection probability averaged over the angle: node t appears at coordinates (r_t, θ_t) where r_t is fixed and θ_t is a random variable, so the author integrates over all possible angular distances and divides by the length of the interval [0, π].
The integration is feasible by the residue theorem, which yields the following classical result:
\int_{0}^{+\infty}\frac{1}{1+x^{\beta}}\,dx = \frac{\pi/\beta}{\sin(\pi/\beta)}, \qquad \text{for } \beta > 1     (59)
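The classical integral of Eq. 59 is easy to verify numerically (an illustrative check, not part of the notes):

```python
# Hedged numerical check of Eq. 59 for a few exponents beta > 1.
import math
from scipy.integrate import quad

for b in (1.5, 2.0, 4.0):
    val, _ = quad(lambda x: 1.0 / (1.0 + x ** b), 0.0, math.inf)
    print(b, val, (math.pi / b) / math.sin(math.pi / b))   # the two should agree
```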
2. Then the probability that node s attracts a link from node t is:
\Pi(s,t) = m\,\frac{p(s,t)}{p(t)} = m\,\frac{(s/t)^{-\beta}}{\int_{1}^{t}(i/t)^{-\beta}\,di}     (60)
where p(t) = \int_{1}^{t} p(s,t)\,ds is the probability that a single random trial results in a connection. This is equivalent to Model 0 and Model 1.
Model 3. Popularity versus Similarity (try to connect with each node according to the Fermi-Dirac connection probability)
• reference
Popularity versus Similarity in growing networks
Fragkiskos Papadopoulos, Maksim Kitsak, M. Ángeles Serrano, Marián Boguñá & Dmitri Krioukov
Nature, 2012
• generative process:
1. initially the network is empty;
2. at time t ≥ 1, new node t appears having coordinates (rt, θt), where rt = lnt,
while θt is uniformly distributed on [0, 2π], and every existing node s, s < t,
moves increasing its radial coordinate according to rs(t) = βrs + (1 − β)rt with
parameter β ∈ [0, 1];
3. node t tries to connect with each existing node s (s < t) with probability p(x_{st}) = \frac{1}{1 + e^{(x_{st}-R_t)/T}}.
• Degree Distribution
we again derive the probability that node s attracts a link from the m new links introduced by node t:
1. Intuitively, since node t now tries every existing node exactly once instead of making random trials, the (angle-averaged) probability that node t connects with node s is:
p'(s,t) = t\,p(s,t)     (61)
where p(s,t) is the per-trial probability of Model 2 (which contains the 1/t factor for random selection). We still have the useful condition that the average number of nodes that node t connects to is:
\bar{N}(R_t) = \int_{1}^{t} p'(s,t)\,ds = t\int_{1}^{t} p(s,t)\,ds = t\,p(t)     (62)
if we set \bar{N}(R_t) = m, we get the following relationship:
t = \frac{m}{p(t)}     (63)
then we obtain the equivalent probability formula:
p'(s,t) = t\,p(s,t) = \frac{m}{p(t)}\,p(s,t) = m\,\frac{(s/t)^{-\beta}}{\int_{1}^{t}(i/t)^{-\beta}\,di}     (64)
which is equivalent to Model 0, Model 1 and Model 2.
Model 4. Popularity versus Similarity with internal links
• reference
Network Mapping by Replaying Hyperbolic Growth
Fragkiskos Papadopoulos, Constantinos Psomas, and Dmitri Krioukov
IEEE/ACM Transactions on Networking, 2015
• generative process:
1. initially the network is empty;
2. at time t ≥ 1, new node t appears having coordinates (rt, θt), where rt = lnt,
while θt is uniformly distributed on [0, 2π], and every existing node s, s < t,
moves increasing its radial coordinate according to rs(t) = βrs + (1 − β)rt with
parameter β ∈ [0, 1];
3. node t tries to connect with each existing node s (s < t) with probability p(x_{st}) = \frac{1}{1 + e^{(x_{st}-R_t)/T}};
4. the radius R_t is adjusted so that the expected number of connections a new node i establishes is \bar{m}_i(t) = m + \bar{L}_i(t), where \bar{L}_i(t) is the expected number of internal links between node i and existing nodes j < i by time t.
• Network Mapping
1. \bar{L}_i(t)
we first derive p(i, j, l), the probability that nodes i and j establish an internal link at time l. The trick is to integrate out the angular random variable θ_{ij} and divide by π to get the angle-averaged connection probability:
p(i,j,l) = \frac{1}{l^{2}}\,\frac{1}{\pi}\int_{0}^{\pi}\frac{1}{1 + e^{\left(r_i(l)+r_j(l)+\ln\frac{\theta_{ij}}{2}-R_l\right)/T}}\,d\theta_{ij}
(let X(i,j,l) = e^{r_i(l)+r_j(l)-R_l})
= \frac{1}{l^{2}}\,\frac{1}{\pi}\int_{0}^{\pi}\frac{1}{1 + \left(X(i,j,l)\,\theta/2\right)^{1/T}}\,d\theta
(ref. to Eq. 59)
= \frac{2T}{l^{2}\sin(T\pi)}\cdot\frac{1}{X(i,j,l)}     (65)
there is a hidden approximation (trick) here: Eq. 59 holds for an infinite upper integration limit, whereas the actual upper limit here is finite (θ = π).
Then we derive Π(i, j, l), which stands for the probability that nodes i and j are selected at random and get connected. Since L internal links are introduced when each new node appears:
\Pi(i,j,l) = L\cdot\frac{\frac{2T}{l^{2}\sin(T\pi)\,X(i,j,l)}}{\frac{1}{2}\int_{1}^{l}\!\int_{1}^{l}\frac{2T}{l^{2}\sin(T\pi)\,X(i,j,l)}\,di\,dj}
= 2L\cdot\frac{e^{-(r_i(l)+r_j(l)-R_l)}}{\int_{1}^{l}\!\int_{1}^{l} e^{-(r_i(l)+r_j(l)-R_l)}\,di\,dj}
= 2L\cdot\frac{e^{-(r_i(l)+r_j(l))}}{\int_{1}^{l} e^{-r_i(l)}\,di\,\int_{1}^{l} e^{-r_j(l)}\,dj}
= 2L\cdot\frac{e^{-(r_i(l)+r_j(l))}}{\left(\int_{1}^{l} e^{-r_i(l)}\,di\right)^{2}}
= 2L\cdot\frac{(ij)^{-\beta}\,l^{-2(1-\beta)}}{I_{l}^{2}}, \qquad \text{where } I_{l} = \frac{1}{1-\beta}\left(1 - e^{-(1-\beta)r_l}\right)     (66)
then we have all the ingredients for deriving \bar{L}_i(t):
\bar{L}_i(t) = \int_{1}^{i}\left[\int_{i}^{t}\Pi(i,j,l)\,dl\right]dj = \int_{1}^{i}\left[\int_{i}^{t} 2L\cdot\frac{(ij)^{-\beta}\,l^{-2(1-\beta)}}{I_{l}^{2}}\,dl\right]dj
(assuming I_l ≈ I_t for large l)
= \frac{2L(1-\beta)}{\left(1 - t^{-(1-\beta)}\right)^{2}(2\beta-1)}\left[\left(\frac{t}{i}\right)^{2\beta-1} - 1\right]\left(1 - i^{-(1-\beta)}\right)     (67)
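The closed form of Eq. 67 can be checked numerically (an illustrative check with arbitrary parameters; `dblquad` integrates Π(i, j, l) of Eq. 66 directly, once with the exact I_l and once with the approximation I_l ≈ I_t used above):

```python
# Hedged numerical check of Eq. 67 against a direct double integral of Eq. 66.
import math
from scipy.integrate import dblquad

L, beta, t, i = 1.0, 0.7, 2000.0, 50.0
I = lambda l: (1.0 - l ** -(1.0 - beta)) / (1.0 - beta)      # I_l with r_l = ln l

def Pi(l, j, Il):                                            # Eq. 66
    return 2.0 * L * (i * j) ** -beta * l ** (-2.0 * (1.0 - beta)) / Il ** 2

# dblquad integrates f(y, x) with x = j in [1, i] and y = l in [i, t]
num_exact, _ = dblquad(lambda l, j: Pi(l, j, I(l)), 1.0, i, lambda j: i, lambda j: t)
num_approx, _ = dblquad(lambda l, j: Pi(l, j, I(t)), 1.0, i, lambda j: i, lambda j: t)

closed = (2.0 * L * (1.0 - beta) / ((1.0 - t ** -(1.0 - beta)) ** 2 * (2.0 * beta - 1.0))
          * ((t / i) ** (2.0 * beta - 1.0) - 1.0) * (1.0 - i ** -(1.0 - beta)))  # Eq. 67
print(num_exact, num_approx, closed)      # num_approx should match `closed`
```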
2. Average Node Degree \bar{k}_i(t)
Recall from Model 1 that the probability that an existing node i attracts a link from one of the m links introduced by node l is:
\Pi(i,l) = m\,\frac{e^{-r_i(l)}}{\int_{1}^{l} e^{-r_i(l)}\,di}     (68)
In this model, m is replaced with \bar{m}_i(t), and during the derivation the author again makes use of the approximation I_l ≈ I_t for large t. This approximation is necessary but not elegant, since the resulting expression is still very complex; the explicit formula is omitted here. The conclusion is the same as in the previous models:
\bar{k}_i(t) \propto \left(\frac{i}{t}\right)^{-\beta}, \qquad \bar{k} = 2(m + L)     (69)
3. Node Appearance Time
The author employs a two-stage process to infer the coordinates of the nodes. The first step is to deduce the node appearance times. Denote the likelihood of the node coordinates by:
L_{1} = L(\{r_i(t), \theta_i\}\,|\,\alpha_{ij}, \Phi)     (70)
(a) Using Bayes' rule:
\underbrace{L(\{r_i(t), \theta_i\}\,|\,\alpha_{ij}, \Phi)}_{L_{1}}\cdot\underbrace{L(\alpha_{ij}\,|\,\Phi)}_{L_{3}} = P(\{r_i(t), \theta_i\}\,|\,\Phi)\cdot\underbrace{L(\alpha_{ij}\,|\,\{r_i(t), \theta_i\}, \Phi)}_{L_{2}}     (71)
It should be noted that we only need to solve for {r_i(t), θ_i}, so maximizing L_1 is equivalent to maximizing P({r_i(t), θ_i}|Φ) · L_2.
(b) P({r_i(t), θ_i}|Φ)
which stands for the joint probability of the nodes' coordinates. We first derive the distribution of a node's radial coordinate:
r_i(t) < r \;\Longleftrightarrow\; \beta r_i + (1-\beta)r_t < r \;\Longleftrightarrow\; i < e^{\frac{1}{\beta}r - \frac{1-\beta}{\beta}r_t}     (72)
since the appearance time i is uniformly distributed on [1, t]:
p(r_i(t) < r) = \frac{1}{t}\,e^{\frac{1}{\beta}r - \frac{1-\beta}{\beta}r_t}     (73)
This is the cumulative distribution, so the probability density is the derivative of p(r_i(t) < r) with respect to r:
f_t(i) = \frac{1}{\beta}\,e^{\frac{1}{\beta}(r - r_t)}     (74)
Then
P(\{r_i(t), \theta_i\}\,|\,\Phi) = \left(\frac{1}{2\pi}\right)^{t}\prod_{i}^{t} f_t(i)     (75)
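A short Monte Carlo check of Eqs. 73-74 (my own illustration; the appearance time is sampled as a continuous uniform variable on [1, t], which is an approximation to the discrete model):

```python
# Hedged check: the empirical CDF of r_i(t) should follow Eq. 73.
import math
import random

beta, t, n = 0.6, 10000, 200000
rt = math.log(t)
rng = random.Random(0)
samples = [beta * math.log(rng.uniform(1, t)) + (1 - beta) * rt for _ in range(n)]

for frac in (0.7, 0.8, 0.9):
    r = frac * rt                                               # a test radius
    empirical = sum(x < r for x in samples) / n
    analytic = math.exp(r / beta - (1 - beta) / beta * rt) / t  # Eq. 73
    print(round(r, 3), empirical, analytic)
```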
(c) L_2
L_{2} = \prod_{1\le j<i\le t}\tilde{p}(x_{ij}(t))^{\alpha_{ij}}\left[1 - \tilde{p}(x_{ij}(t))\right]^{1-\alpha_{ij}}     (76)
(d) maximize L_1
the derivative of the logarithm of L_1 with respect to r_i(t) is:
\frac{\partial \ln L_{1}}{\partial r_i(t)} = \frac{1}{\beta} - \frac{1}{T}\sum_{j=1, j\ne i}^{t}\left[\alpha_{ij} - \frac{1}{1 + e^{(x_{ij}(t)-R_t)/T}}\right]     (77)
setting the derivative equal to 0:
\underbrace{\tilde{\bar{k}}_i(t)}_{\text{expected degree}} = \underbrace{k_i}_{\text{empirical degree}} - \frac{T}{\beta}     (78)
using the mean-field approximation for \bar{k}_i(t):
\underbrace{\tilde{\bar{k}}_i(t)}_{\text{expected degree}} \approx \underbrace{\bar{k}_i(t)}_{\text{mean-field approximation}} \propto \left(\frac{i^{*}}{t}\right)^{-\beta} = \underbrace{k_i}_{\text{empirical degree}} - \frac{T}{\beta}     (79)
then the MLE of i^{*} is:
\underbrace{i^{*}}_{\text{MLE of appearance time}} \propto k_i^{-\frac{1}{\beta}} = k_i^{-(\gamma-1)}     (80)
we find that the higher the empirical degree of a node, the earlier its MLE appearance time.
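In practice this means the inferred appearance order is simply the descending-degree order; a tiny illustration (toy degrees, not data from the paper):

```python
# Hedged illustration of Eq. 80: rank nodes by k^{-1/beta} (ascending) to get
# the MLE appearance order; the highest-degree node is inferred to be oldest.
beta = 0.6
degrees = {"v1": 3, "v2": 40, "v3": 11, "v4": 7}                # toy degrees
scores = {v: k ** (-1.0 / beta) for v, k in degrees.items()}    # proportional to i*
print(sorted(scores, key=scores.get))                           # ['v2', 'v3', 'v4', 'v1']
```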
4. node's angular coordinate
the angular coordinate of each node is obtained by maximizing the following per-node likelihood, sampling different θ in [0, 2π]:
L_{2}^{i} = \prod_{1\le j<i} p(x_{ij})^{\alpha_{ij}}\left[1 - p(x_{ij})\right]^{1-\alpha_{ij}}     (81)