Simple and Deep Graph Convolutional
Networks
Jin-Woo Jeong
Network Science Lab
Dept. of Mathematics
The Catholic University of Korea
E-mail: zeus0208b@catholic.ac.kr
Chen, Ming, et al. "Simple and deep graph convolutional
networks." International conference on machine learning. PMLR, 2020.
2
 Introduction
 GCNII Model
 Experiments
 Conclusion
Q/A
3
Introduction
Motivation: GCNs
• Several GCN models have demonstrated superior performance across various applications.
• However, most of the previous GCN models are shallow. Models like GCN and GAT achieve their best
performance with a 2-layer architecture. Such shallow structures limit their ability to extract information
from high-order neighbors.
P̃ = D̃^(−1/2) Ã D̃^(−1/2)   (the symmetrically normalized adjacency matrix with self-loops)
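The normalized propagation matrix above can be computed directly; a minimal numpy sketch on a toy 4-node path graph (the graph is illustrative, not from the paper):

```python
import numpy as np

# Toy undirected graph: a 4-node path 0-1-2-3 (illustrative example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_tilde = A + np.eye(4)                 # add self-loops: A~ = A + I
d = A_tilde.sum(axis=1)                 # degrees of A~
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D~^(-1/2)
P = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # P~ = D~^(-1/2) A~ D~^(-1/2)

# P~ is symmetric and its largest eigenvalue is exactly 1
# (with eigenvector D~^(1/2) 1), which keeps repeated propagation stable.
```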
4
Introduction
Motivation: Over-smoothing
• However, stacking more layers and adding non-linearity tends to degrade their performance. This
phenomenon is called over-smoothing, which suggests that as the number of layers increases, the
representations of nodes in a GCN converge to a certain value and thus become
indistinguishable.
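This convergence can be seen numerically: repeatedly multiplying node features by P̃ (pure propagation, no weights) drives the representations of different nodes together. A small sketch on the same toy 4-node path graph:

```python
import numpy as np

# 4-node path graph with self-loops, symmetrically normalized.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_tilde = A + np.eye(4)
d = A_tilde.sum(axis=1)
P = np.diag(d ** -0.5) @ A_tilde @ np.diag(d ** -0.5)

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 8))          # random node features

d0 = np.linalg.norm(H[0] - H[3])         # node distance before propagation
Hk = np.linalg.matrix_power(P, 100) @ H  # 100 propagation steps
dk = np.linalg.norm(Hk[0] - Hk[3])       # node distance after propagation

# Rows of P^k H converge to multiples of sqrt(degree), so nodes 0 and 3
# (equal degree) become numerically indistinguishable: dk << d0.
```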
5
Introduction
ResNet
 Residual Connection
APPNP
 APPNP with K-hop aggregation
Identity mapping
Initial residual connection
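APPNP's K-hop aggregation already contains the initial-residual idea: each step mixes the propagated features with the initial representation H0 via a teleport factor α. A minimal sketch (function and variable names are illustrative, assuming H0 is the MLP-transformed feature matrix):

```python
import numpy as np

def appnp_propagate(P, H0, K=10, alpha=0.1):
    """APPNP K-hop aggregation: H_{k+1} = (1 - alpha) * P @ H_k + alpha * H0.

    H0 acts as the teleport ("initial residual") term that GCNII later
    reuses as its initial residual connection.
    """
    H = H0
    for _ in range(K):
        H = (1 - alpha) * P @ H + alpha * H0
    return H

# Usage on a toy 4-node path graph:
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_tilde = A + np.eye(4)
d = A_tilde.sum(axis=1)
P = np.diag(d ** -0.5) @ A_tilde @ np.diag(d ** -0.5)
H0 = np.eye(4)                               # one-hot "features"
Z = appnp_propagate(P, H0, K=10, alpha=0.1)  # smoothed, but anchored to H0
```

Because the α·H0 term is re-added at every step, Z never collapses to a degree-scaled constant the way pure P^K propagation does.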
6
GCNII Model
GCNII
7
GCNII Model
GCNII
Initial residual connection
8
GCNII Model
GCNII
Initial residual connection
GCNII*
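Combining the two techniques gives the GCNII layer from the paper, H^(ℓ+1) = σ(((1−α)P̃H^(ℓ) + αH^(0))((1−β_ℓ)I + β_ℓ W^(ℓ))) with β_ℓ = ln(λ/ℓ + 1). A numpy sketch (variable names are illustrative; GCNII* would instead apply separate weight matrices to the propagated and initial terms):

```python
import numpy as np

def gcnii_layer(P, H, H0, W, alpha=0.1, beta=0.5):
    """One GCNII layer (sketch).

    Initial residual: mix the propagated representation P @ H with the
    initial representation H0, weighted by alpha.
    Identity mapping: mix the weight matrix W with the identity, weighted
    by beta (in the paper, beta_l = ln(lambda / l + 1)).
    """
    dim = W.shape[0]
    support = (1 - alpha) * (P @ H) + alpha * H0           # initial residual connection
    out = support @ ((1 - beta) * np.eye(dim) + beta * W)  # identity mapping
    return np.maximum(out, 0.0)                            # ReLU

# Usage on a toy 4-node path graph:
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_tilde = A + np.eye(4)
deg = A_tilde.sum(axis=1)
P = np.diag(deg ** -0.5) @ A_tilde @ np.diag(deg ** -0.5)

rng = np.random.default_rng(0)
H0 = rng.standard_normal((4, 8))             # initial representation
W = rng.standard_normal((8, 8)) * 0.1
H1 = gcnii_layer(P, H0, H0, W, alpha=0.1, beta=0.5)
```

With α = β = 0 the layer degenerates to plain propagation σ(P̃H), which is why both terms together are what keep deep stacks from over-smoothing.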
9
Experiments
Dataset
 For semi-supervised node classification
• Citation networks: Cora, Citeseer,
Pubmed
 For full-supervised node classification
• Chameleon
• Cornell
• Texas
• Wisconsin
 For inductive learning
• PPI
10
Experiments
 Baselines
• GCN
• GAT
• APPNP
• JKNet
• JKNet(Drop)
• Incep(Drop)
11
Experiments
A detailed comparison with other deep models
12
Experiments
13
Experiments
Inductive Learning
14
Experiments
15
Experiments
Ablation Study
16
Conclusion
Conclusion
 We propose GCNII, a simple and deep GCN model that prevents over-smoothing via initial residual
connections and identity mapping.
 Experiments show that the deep GCNII model achieves new state-of-the-art results on various semi- and
full-supervised tasks.
17
Q & A
Q / A

