3. INTRODUCTION
The paper tackles the problem of network growing, which is in the same line of research as our work.
• Observed undirected graph G = (V, E)
• Adjacency matrix A
• Node attributes X ∈ ℝ^(N×d₀)
• New nodes V_new with attributes X_new
The proposed approach learns to generate the overall adjacency matrix A_new for V ∪ V_new.
A new node comes in with known attributes.
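As a concrete toy instance of this setup, the shapes involved can be sketched in NumPy; all sizes and variable names below are illustrative choices of mine, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: N observed nodes, N_new incoming nodes, d0 attribute dimensions
N, N_new, d0 = 5, 2, 3

# Observed undirected graph: symmetric 0/1 adjacency matrix with no self-loops
A = (rng.random((N, N)) < 0.4).astype(int)
A = np.triu(A, 1)
A = A + A.T

X = rng.standard_normal((N, d0))          # node attributes, X in R^(N x d0)
X_new = rng.standard_normal((N_new, d0))  # known attributes of the new nodes

# The model's target: the overall adjacency matrix A_new over V ∪ V_new
A_new_shape = (N + N_new, N + N_new)
```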
4. VARIATIONAL AUTOENCODERS (VAE)
[Diagram: encoder q_φ(z|x_i) maps input x_i to latent z; decoder p_θ(x_i|z) reconstructs x_i; prior p(z)]
A well-known deep generative model.
It consists of two neural networks:
• Encoder
• Decoder
The key idea is to model the
latent variable as a Gaussian
distribution so we can draw a
sample from it.
Loss function for a datapoint x_i:
ℒ_i(θ, φ) = −E_{z∼q_φ(z|x_i)}[log p_θ(x_i|z)] + KL(q_φ(z|x_i) ‖ p(z))
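This per-datapoint loss can be sketched in NumPy; the Bernoulli decoder and the single-sample Monte-Carlo estimate below are my own illustrative choices, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def vae_loss(x, mu, log_var, decode):
    """One-sample Monte-Carlo estimate of the per-datapoint VAE loss:
    -E_{z~q_phi(z|x)}[log p_theta(x|z)] + KL(q_phi(z|x) || N(0, I))."""
    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I)
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * log_var) * eps
    # Bernoulli reconstruction term (assumes binary features x)
    p = np.clip(decode(z), 1e-7, 1 - 1e-7)
    recon = -np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    # Closed-form KL between the Gaussian posterior and the N(0, I) prior
    kl = 0.5 * np.sum(mu**2 + np.exp(log_var) - log_var - 1.0)
    return recon + kl

# Toy usage: an identity-then-sigmoid "decoder" over a 3-dim latent
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
x = np.array([1.0, 0.0, 1.0])
loss = vae_loss(x, mu=np.zeros(3), log_var=np.zeros(3), decode=sigmoid)
```

With mu = 0 and log_var = 0 the KL term vanishes, so the toy loss is pure reconstruction error.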
7. GENERATIVE GRAPH CONVOLUTIONAL NETWORK (G-GCN)
Objective function:
Σ_{i=1}^{n−1} Reconstruction loss (i-th step) + β Σ_{i=1}^{n−1} KL (i-th step),
treating incoming nodes as being added one-by-one into the graph.
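A hedged sketch of accumulating this objective, assuming incoming nodes (indices at or beyond the observed count) arrive one at a time; the per-step loss callables are stand-ins, not the paper's actual GCN encoder and decoder:

```python
import numpy as np

def growing_graph_objective(A, n_obs, step_recon, step_kl, beta=1.0):
    """Sum over sequential node additions: each incoming node i contributes
    a reconstruction loss on its links to the nodes already present, plus a
    beta-weighted KL term."""
    n = A.shape[0]
    total = 0.0
    for i in range(n_obs, n):
        links = A[i, :i]  # links of node i to all previously present nodes
        total += step_recon(links) + beta * step_kl(links)
    return total

# Toy usage with stand-in per-step losses
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
obj = growing_graph_objective(A, n_obs=1,
                              step_recon=lambda links: float(links.sum()),
                              step_kl=lambda links: 0.5,
                              beta=2.0)
```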
8. EXPERIMENTAL RESULTS
A growing graph is constructed by randomly sampling an observed subgraph containing 70% of all nodes; the remaining 30% are treated as incoming new nodes.
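The 70% split can be sketched as follows; the function and variable names are my own, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

def split_growing_graph(A, X, observed_frac=0.7):
    """Randomly sample observed_frac of the nodes as the observed subgraph;
    the remaining nodes play the role of incoming new nodes."""
    n = A.shape[0]
    perm = rng.permutation(n)
    n_obs = int(round(observed_frac * n))
    obs, new = perm[:n_obs], perm[n_obs:]
    A_obs = A[np.ix_(obs, obs)]  # adjacency among observed nodes only
    return A_obs, X[obs], X[new]

# Toy usage on a 10-node undirected graph with 2-dim attributes
A = (rng.random((10, 10)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T
X = rng.standard_normal((10, 2))
A_obs, X_obs, X_new = split_growing_graph(A, X)
```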
Link prediction performance is reported.
9. DISCUSSIONS
The problem is of interest to our research
The paper is not fully written:
• The notation is confusing
• The experimental section lacks information
There seems to be an underlying assumption that the growth of nodes follows a Gaussian distribution.
The method may contain flaws.