LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation.pptx
1. Ho-Beom Kim
Network Science Lab
Dept. of Mathematics
The Catholic University of Korea
E-mail: hobeom2001@catholic.ac.kr
2023 / 11 / 13
He, Xiangnan, et al.
ACM SIGIR 2020
2.
Introduction
Problem Statements
• NGCF is a model developed from GCN that deepens the use of the sub-graph structure by exploiting high-hop neighbors.
• Following the GCN structure, NGCF refines user and item embeddings through feature transformation, neighborhood aggregation, and nonlinear activation, but this design is very heavy and burdensome to train.
• In this paper, the authors propose LightGCN, which overcomes these problems; the results of comparing NGCF and LightGCN are as follows. (A hop is one step along a path between a source and a destination in the network; "high-hop" describes how far a neighbor is from the target node.)
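The difference the paper emphasizes can be sketched concretely. Below is a minimal NumPy illustration (toy interaction matrix and random embeddings, all hypothetical) contrasting an NGCF-style layer, which applies a learned weight matrix and a nonlinear activation, with a LightGCN-style layer, which performs only normalized neighborhood aggregation. This is a simplified sketch, not the papers' exact propagation rules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user-item interaction matrix R (3 users x 4 items) -- illustrative only.
R = np.array([[1, 0, 1, 0],
              [0, 1, 1, 0],
              [1, 0, 0, 1]], dtype=float)

n_users, n_items = R.shape
# Adjacency of the bipartite user-item graph: A = [[0, R], [R^T, 0]].
A = np.block([[np.zeros((n_users, n_users)), R],
              [R.T, np.zeros((n_items, n_items))]])

# Symmetric normalization D^{-1/2} A D^{-1/2}, i.e. each edge is weighted
# by 1 / sqrt(|N_u| * |N_i|).
deg = A.sum(axis=1)
d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

dim = 8
E = rng.normal(size=(n_users + n_items, dim))  # layer-0 embeddings
W = rng.normal(size=(dim, dim))                # NGCF-style trainable weights

def ngcf_layer(E, W):
    # Simplified NGCF-style layer: aggregation, then feature transformation
    # (matrix W), then nonlinear activation (LeakyReLU).
    Z = A_hat @ E @ W
    return np.where(Z > 0, Z, 0.2 * Z)

def lightgcn_layer(E):
    # LightGCN layer: normalized neighborhood aggregation only -- no weight
    # matrix, no activation.
    return A_hat @ E
```

Removing `W` and the activation is exactly what makes LightGCN "light": the only trainable parameters left are the layer-0 embeddings.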
3.
Introduction
Contribution
1. They empirically show that two common designs in GCNs, feature transformation and nonlinear activation, have no positive effect on the effectiveness of collaborative filtering.
2. They propose LightGCN, which largely simplifies the model design by including only the most essential components of GCN for recommendation.
3. They empirically compare LightGCN with NGCF under the same settings and demonstrate substantial improvements. In-depth analyses of the rationality of LightGCN are provided from both technical and empirical perspectives.
19.
Conclusions
Conclusions and Future Works
• We proposed LightGCN, which consists of two essential components: light graph convolution and layer combination.
• We discard feature transformation and nonlinear activation, two standard operations in GCNs that inevitably increase training difficulty.
• In layer combination, we construct a node's final embedding as the weighted sum of its embeddings on all layers, which is proved to subsume the effect of self-connections and helps control oversmoothing.
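The two components above can be sketched end to end. The NumPy snippet below (toy graph and sizes, all hypothetical) propagates embeddings for K layers with light graph convolution, then forms final embeddings by layer combination with the uniform weights alpha_k = 1/(K+1) used in the paper, and scores user-item pairs by inner product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user-item interaction matrix (2 users x 3 items) -- illustrative only.
R = np.array([[1, 0, 1],
              [0, 1, 1]], dtype=float)
n_users, n_items = R.shape

# Symmetrically normalized adjacency of the bipartite graph.
A = np.block([[np.zeros((n_users, n_users)), R],
              [R.T, np.zeros((n_items, n_items))]])
deg = A.sum(axis=1)
d = np.where(deg > 0, deg ** -0.5, 0.0)
A_hat = d[:, None] * A * d[None, :]

K = 3                                           # number of propagation layers
E0 = rng.normal(size=(n_users + n_items, 4))    # layer-0 embeddings

# Light graph convolution: E^{(k+1)} = A_hat @ E^{(k)}; keep every layer.
layers = [E0]
for _ in range(K):
    layers.append(A_hat @ layers[-1])

# Layer combination: weighted sum with uniform weights alpha_k = 1/(K+1).
E_final = sum(layers) / (K + 1)

# Split back into user/item embeddings; score by inner product.
users, items = E_final[:n_users], E_final[n_users:]
scores = users @ items.T
```

Because the layer-0 embeddings enter the sum, a node's own features survive in `E_final` even though `A_hat` has no self-loops, which is how layer combination subsumes self-connections and damps oversmoothing.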
Editor's Notes
Because the embedding of the last layer becomes over-smoothed as the number of layers increases, using only the last layer is problematic. Each layer captures different semantics, so a more comprehensive representation can be extracted by combining the information captured at different layers. Moreover, combining the embeddings of different layers through a weighted sum captures the self-connection effect of existing graph convolution.