Graph Transformer with Graph Pooling for Node Classification (IJCAI 2023)
Joo-Ho Lee
School of Computer Science and Information Engineering,
The Catholic University of Korea
E-mail: jooho414@gmail.com
2023-09-25
1
Introduction
Problem Statement
• Existing GTs are applied primarily to graph-level tasks (e.g., graph classification), where each graph contains only a small number of nodes
• Developing GTs for node classification, where the number of nodes in a graph is relatively large (up to around one million), remains challenging for the following two reasons
2
Introduction
Problem Statement
• First, the quadratic computational complexity $O(n^2)$ of self-attention in vanilla GTs with respect to the number of nodes inhibits their application to node classification in real-world scenarios
• Second, vanilla GTs compute fully-connected attention and aggregate messages from arbitrary nodes, including numerous irrelevant ones
• This results in ambiguous attention weights and the aggregation of noisy information from incorrectly correlated nodes
3
Introduction
Problem Statement
• Only a few existing works have considered GTs for node classification
• GT-sparse and SAN confine the receptive field of each node to its 1-hop neighbors
4
Introduction
Problem Statement
• As a result, expressiveness is sacrificed when important interactions are multiple hops away, especially in large-scale graphs, which correspondingly require a large receptive field
• Existing studies neglect the unique characteristics of graph data and tend to yield dense attention, causing an enormous amount of noisy messages to be aggregated from irrelevant nodes
• In light of the above analysis, the authors propose Gapformer, which combines the Graph Transformer with graph pooling to capture long-range dependencies and improve the efficiency of vanilla GTs
5
Introduction
Problem Statement
• In vanilla GTs, self-attention converts nodes into queries and keys/values, after which each query attends to all
the keys
• Specifically, self-attention involves computing the inner product between the query and key vectors to generate
attention scores
• These scores are then used to perform a weighted aggregation of value vectors
• To reduce the complexity of the dense inner product, Gapformer first utilizes graph pooling to group key and
value nodes into a smaller number of pooling nodes
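To make the contrast concrete, here is a minimal, hypothetical PyTorch sketch (not the authors' code): dense self-attention over all n nodes versus attention against a pooled set of m ≪ n key/value nodes, with `pool_fn` standing in for an arbitrary graph pooling operator.

```python
import torch
import torch.nn.functional as F

def dense_attention(H, Wq, Wk, Wv):
    # Vanilla GT self-attention: every node attends to all n nodes -> O(n^2) scores
    Q, K, V = H @ Wq, H @ Wk, H @ Wv              # each (n, d)
    scores = (Q @ K.T) / K.shape[-1] ** 0.5       # (n, n) query-key inner products
    return F.softmax(scores, dim=-1) @ V          # weighted aggregation of values

def pooled_attention(H, Wq, Wk, Wv, pool_fn):
    # Gapformer-style idea: pool keys/values into m << n nodes -> O(n*m) scores
    Q = H @ Wq                                    # (n, d)
    K_p, V_p = pool_fn(H @ Wk), pool_fn(H @ Wv)   # (m, d) each
    scores = (Q @ K_p.T) / K_p.shape[-1] ** 0.5   # (n, m)
    return F.softmax(scores, dim=-1) @ V_p        # (n, d)

# Toy usage with a naive mean-pooling into fixed groups (purely illustrative)
n, d, m = 1024, 64, 32
H = torch.randn(n, d)
Wq, Wk, Wv = torch.randn(d, d), torch.randn(d, d), torch.randn(d, d)
pool = lambda X: X.view(m, n // m, d).mean(dim=1)  # stand-in pooling operator
out = pooled_attention(H, Wq, Wk, Wv, pool)        # (1024, 64)
```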
6
Introduction
Problem Statement
• For graph pooling, the authors propose two strategies to compress the original graph efficiently and effectively
1. global graph pooling
2. local graph pooling
7
Introduction
Contribution
• They propose Gapformer, a deeper combination of the Transformer and Graph Neural Networks
  – Specifically, Gapformer utilizes graph pooling to group the attended nodes of each node into a smaller number of pooling nodes and computes attention using only the pooling nodes
• They conduct extensive experiments comparing Gapformer with 20 GNN and GT baseline models on the node classification task over 13 real-world graph datasets, including homophilic and heterophilic datasets
8
Methodology
Architecture
9
Methodology
Attention Enhanced with Graph Pooling
$$\mathbf{Q} = \mathbf{H}\mathbf{W}_Q,\qquad \mathbf{K} = \mathbf{H}\mathbf{W}_K,\qquad \mathbf{V} = \mathbf{H}\mathbf{W}_V$$

$$\mathbf{K}_{\mathcal{S}(i)} = \mathrm{Pooling}(\mathbf{K}),\qquad \mathbf{V}_{\mathcal{S}(i)} = \mathrm{Pooling}(\mathbf{V})$$

$$\boldsymbol{h}_i = \mathrm{softmax}\!\left(\alpha\,\boldsymbol{q}_i\,\mathbf{K}_{\mathcal{S}(i)}^{\top}\right)\mathbf{V}_{\mathcal{S}(i)}$$

$$\boldsymbol{z}_i = \mathrm{softmax}\!\left(\alpha\,\boldsymbol{q}_i\,\mathbf{K}_{\mathcal{N}(i)}^{\top}\right)\mathbf{V}_{\mathcal{N}(i)}$$

$$\boldsymbol{h}_i' = \boldsymbol{h}_i + \beta\,\boldsymbol{z}_i$$
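Read operationally, a hedged PyTorch sketch of this combination for all nodes might look as follows (illustrative only; the scaling α, the weight β, and the pooling operator are treated as given, not as the paper's exact choices).

```python
import torch
import torch.nn.functional as F

def gap_attention(H, Wq, Wk, Wv, pool_fn, neighbors, alpha, beta):
    """Sketch of h'_i = h_i + beta * z_i: attention over pooled keys/values (h)
    plus attention restricted to each node's neighborhood N(i) (z).
    `neighbors[i]` is a LongTensor of node i's neighbor indices."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv

    # h_i: attention against the pooled keys/values K_S, V_S
    K_s, V_s = pool_fn(K), pool_fn(V)                     # (m, d)
    h = F.softmax(alpha * (Q @ K_s.T), dim=-1) @ V_s      # (n, d)

    # z_i: attention restricted to node i's own neighborhood
    z = torch.zeros_like(h)
    for i, nbrs in enumerate(neighbors):
        K_n, V_n = K[nbrs], V[nbrs]
        z[i] = F.softmax(alpha * (Q[i] @ K_n.T), dim=-1) @ V_n

    return h + beta * z                                   # h'_i = h_i + beta * z_i
```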
10
Methodology
Attention Enhanced with Global Graph Pooling (AGP-G)
$$\mathbf{K}_{\mathcal{S}}^{\mathrm{Global}} = \mathrm{Pooling}(\mathbf{H}\mathbf{W}_K,\ \mathbf{A}),\qquad \mathbf{V}_{\mathcal{S}}^{\mathrm{Global}} = \mathrm{Pooling}(\mathbf{H}\mathbf{W}_V,\ \mathbf{A})$$
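Since the global pooling operator takes the projected node features together with the adjacency matrix A, one way to picture it is a learnable soft-assignment pooling. The sketch below is an assumption in the spirit of DiffPool, not necessarily the operator Gapformer actually uses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalSoftPooling(nn.Module):
    """Toy stand-in for Pooling(HW, A): soft-assigns n nodes to m pooled nodes.
    Gapformer's actual global pooling operator may differ."""
    def __init__(self, in_dim, num_pooled):
        super().__init__()
        self.assign = nn.Linear(in_dim, num_pooled)

    def forward(self, X, A):
        X_prop = A @ X                               # one propagation step so assignments see structure
        S = F.softmax(self.assign(X_prop), dim=-1)   # (n, m) soft cluster assignments
        return S.T @ X                               # (m, d) pooled key/value nodes

# Usage sketch: K_global = pool(H @ Wk, A); V_global = pool(H @ Wv, A)
```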
11
Methodology
Attention Enhanced with Local Graph Pooling (AGP-L)
$$\mathbf{K}_{\mathcal{S}(i)}^{\mathrm{Local}} = \mathrm{Pooling}\!\left(\mathbf{K}_{\mathcal{N}(i,k)}\right),\qquad \mathbf{V}_{\mathcal{S}(i)}^{\mathrm{Local}} = \mathrm{Pooling}\!\left(\mathbf{V}_{\mathcal{N}(i,k)}\right)$$
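In the local variant, each node pools only its own k-hop neighborhood. A minimal sketch follows, assuming mean pooling as the aggregator over a precomputed list of k-hop neighbor indices per node; the paper's actual neighborhood pooling may differ.

```python
import torch

def local_pooled_kv(K, V, khop_neighbors):
    """For each node i, pool keys/values over its k-hop neighborhood N(i, k).
    Mean pooling into a single pooled node per query is an illustrative
    simplification, not necessarily the paper's aggregator."""
    K_local = torch.stack([K[nbrs].mean(dim=0) for nbrs in khop_neighbors])  # (n, d)
    V_local = torch.stack([V[nbrs].mean(dim=0) for nbrs in khop_neighbors])  # (n, d)
    return K_local, V_local
```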
12
Methodology
Merits of Gapformer
• Reducing Computational Complexity
• Reducing the Ratio of Noisy Connections
• Handling Long-range Dependency
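On the first merit, a back-of-the-envelope count (my own estimate, assuming d-dimensional features and m pooling nodes with m ≪ n) is:

$$\underbrace{O(n^2 d)}_{\text{vanilla full attention}}\;\longrightarrow\;\underbrace{O(n\,m\,d)}_{\text{attention over pooled keys/values}},\qquad m \ll n$$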
13
Experiment
Datasets
14
Experiment
Node Classification on homophilic datasets
15
Experiment
Node Classification on heterophilic datasets
16
Experiment
Comparison of training time and GPU memory costs
17
Experiment
Ablation Study: Components of the Graph Transformer architecture
18
Experiment
Ablation Study: Different Pooling Types
19
Experiment
Ablation Study: Different Parameters
20
Conclusion
• In this paper, the authors propose Gapformer, which combines Graph Transformers (GTs) with graph pooling for efficient node classification
• Gapformer addresses the two main issues of existing GTs:
  • potential noise from long-distance neighbors
  • the quadratic computational complexity with respect to the number of nodes
• Extensive experiments on 13 graph datasets demonstrate that Gapformer outperforms existing GTs and Graph Neural Networks
• Despite its competitive performance, Gapformer still has room for improvement:
  • devising an effective way to combine the proposed local-pooling-enhanced and global-pooling-enhanced attention
  • incorporating useful techniques to further enhance performance on large-scale graph datasets