1) The document proposes a graph-based method using graph convolutional networks to address challenges in sequential recommendation, such as extracting implicit preferences from long behavior sequences and adapting to changing user preferences over time.
2) It constructs an interest graph from user behaviors and designs an attentive graph convolutional network and dynamic pooling technique to aggregate implicit signals into explicit preferences.
3) Experimental results on two large-scale datasets show the proposed method significantly outperforms state-of-the-art sequential recommendation methods.
This document proposes the NGNN framework to improve the representation power of graph neural networks. NGNN extracts rooted subgraphs around each node and applies a base GNN independently to learn subgraph representations. These are then aggregated to obtain final node representations. The document outlines limitations of existing GNNs, describes the NGNN framework, and poses research questions about its theoretical power, performance improvements over base GNNs, results on benchmarks, and computational overhead. Key experiments are conducted on graph isomorphism, molecular property prediction, and node classification datasets to evaluate NGNN.
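The rooted-subgraph idea behind NGNN can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the base GNN is replaced by a simple mean over one-hot subgraph features, and `radius` stands in for the subgraph hop depth.

```python
# Minimal sketch of NGNN-style rooted-subgraph extraction on a networkx
# graph; the "base GNN" here is just a mean-pool placeholder.
import networkx as nx
import numpy as np

def rooted_subgraphs(G, radius=1):
    """Extract the h-hop rooted subgraph around every node."""
    return {v: nx.ego_graph(G, v, radius=radius) for v in G.nodes}

def node_representation(sub, feats):
    """Placeholder for the base GNN: mean-pool subgraph node features."""
    return np.mean([feats[u] for u in sub.nodes], axis=0)

G = nx.karate_club_graph()
feats = {v: np.eye(G.number_of_nodes())[v] for v in G.nodes}  # one-hot features
subs = rooted_subgraphs(G, radius=1)
reps = {v: node_representation(subs[v], feats) for v in G.nodes}
print(len(reps), reps[0].shape)  # one representation per node
```

In the actual framework, each subgraph is fed independently through a learned base GNN and the resulting subgraph representations are aggregated into the final node representation.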
This document presents a novel Claim-guided Hierarchical Graph Attention Network (ClaHi-GAT) model for rumor detection using undirected interaction graphs. The model uses multi-level attention - post-level attention considers the content of individual tweets, while event-level attention compares tweets responding to the same claim. This allows the model to better capture features indicative of rumors. Experimental results on three Twitter datasets show the proposed model achieves superior performance for rumor classification and early detection compared to previous structure-based methods.
This document presents a novel graph neural network (GNN) convolutional layer based on Auto-Regressive Moving Average (ARMA) filters. The ARMA layer aims to address limitations of existing GNN layers that use polynomial filters by providing a more flexible frequency response with fewer parameters. It models graph signals using parallel stacks of recurrent operations to approximate high-order neighborhoods efficiently. Experimental results show the ARMA layer outperforms other GNN architectures on tasks like node classification, graph signal classification, and graph regression. Future work could explore incorporating text and content metadata into graph convolutional models.
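The recurrent propagation described above can be illustrated with a toy NumPy version of one ARMA_K layer. The recursion X_{t+1} = σ(L X_t W + X_0 V), the random weight shapes, and the symmetric-normalized propagation operator are assumptions for the sketch, not the authors' exact formulation.

```python
# Hedged sketch of an ARMA_K graph-convolutional layer: K parallel
# recurrent stacks, each running T propagation steps, then averaged.
import numpy as np

def arma_layer(L, X0, K=3, T=2):
    rng = np.random.default_rng(0)
    n, f = X0.shape
    out = np.zeros_like(X0)
    for _ in range(K):                      # K parallel recurrent stacks
        W = rng.normal(size=(f, f)) * 0.1   # propagation weights (toy init)
        V = rng.normal(size=(f, f)) * 0.1   # skip connection from X0
        X = X0
        for _ in range(T):                  # T recurrent propagation steps
            X = np.tanh(L @ X @ W + X0 @ V)
        out += X
    return out / K                          # average the stacks

# toy graph: 4-node cycle, symmetric-normalized adjacency as operator
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
D_inv_sqrt = np.diag(1 / np.sqrt(A.sum(1)))
L = D_inv_sqrt @ A @ D_inv_sqrt
H = arma_layer(L, np.eye(4))
print(H.shape)
```

The skip connection from X0 at every step is what lets a shallow recursion approximate a high-order neighborhood with few parameters.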
The document proposes a novel graph transformer model called DeepGraph. DeepGraph uses substructure sampling to encode local graph information and add substructure tokens. It applies localized self-attention on substructures using a mask. The document experiments with DeepGraph on various graph datasets and analyzes its performance as its depth increases. Deeper models show diminishing returns, indicating a limitation of increasing depth in graph transformers.
This document summarizes a research paper on Graph Multiset Pooling, a new method for graph pooling using a Graph Multiset Transformer (GMT). The GMT treats graph pooling as a multiset encoding problem and uses multi-head attention to capture relationships among nodes. It satisfies the injectiveness and permutation invariance properties needed to be as powerful as the Weisfeiler-Lehman graph isomorphism test. Experimental results show the GMT outperforms other pooling methods on tasks like graph classification, reconstruction, and generation. The GMT provides a powerful and efficient way to learn meaningful representations of entire graphs.
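The attention-pooling idea can be made concrete with a toy single-head version: a learnable seed query attends over the multiset of node embeddings to produce one graph vector. This is an illustrative sketch (GMT itself uses multi-head attention and stacked blocks), but it shows the permutation-invariance property directly.

```python
# Toy attention-based graph pooling in the spirit of GMT: one seed query,
# one head, NumPy only. Reordering the nodes leaves the output unchanged.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, seed, Wq, Wk, Wv):
    """H: (n, d) node embeddings -> (d,) graph-level embedding."""
    q = seed @ Wq                               # query from learnable seed
    K, V = H @ Wk, H @ Wv                       # keys/values from nodes
    scores = softmax(K @ q / np.sqrt(q.size))   # attention over the multiset
    return scores @ V                           # weighted sum over nodes

rng = np.random.default_rng(0)
d, n = 8, 5
H = rng.normal(size=(n, d))
seed, Wq, Wk, Wv = (rng.normal(size=s) for s in [(d,), (d, d), (d, d), (d, d)])
g = attention_pool(H, seed, Wq, Wk, Wv)
g_perm = attention_pool(H[::-1], seed, Wq, Wk, Wv)  # permuted node order
print(np.allclose(g, g_perm))  # True: pooling is permutation-invariant
```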
This document summarizes a research paper that introduces Hyperbolic Graph Convolutional Networks (HGCNs) to address limitations of previous Euclidean graph neural networks. HGCNs map node features to hyperbolic spaces and use a novel attention-based aggregation scheme to capture hierarchical structure. The paper presents HGCNs, evaluates them on citation networks, disease propagation trees, protein networks and flight networks, and finds they outperform Euclidean baselines for link prediction and node classification by learning more interpretable hierarchical representations.
This document presents Graphormer, a Transformer-based model for graph representation learning. Graphormer achieves state-of-the-art performance on graph tasks by introducing three novel encodings: centrality encoding to capture node importance, spatial encoding to encode structural relations between nodes via shortest-path distance, and edge encoding to incorporate edge features. Experiments show Graphormer outperforms GNN baselines by over 10% on various graph datasets, tasks, and leaderboards.
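Two of the encodings can be sketched in a few lines. This is an illustrative toy, not the official implementation: random tables stand in for the learned embeddings, and the attention scores are unscaled dot products without projections.

```python
# Sketch of Graphormer-style centrality and spatial encodings: degree
# embeddings are added to node features, and shortest-path distances index
# a learnable bias added to the attention logits.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.path_graph(4)
n, d = G.number_of_nodes(), 6

X = rng.normal(size=(n, d))
deg_embed = rng.normal(size=(8, d))           # centrality encoding table
X = X + deg_embed[[G.degree(v) for v in G]]   # add degree embedding per node

spd = dict(nx.all_pairs_shortest_path_length(G))
dist = np.array([[spd[u][v] for v in G] for u in G])
spatial_bias = rng.normal(size=8)             # spatial encoding table

logits = X @ X.T / np.sqrt(d)                 # toy attention scores
logits = logits + spatial_bias[dist]          # bias indexed by SPD
attn = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)
print(attn.shape)
```

The shortest-path-distance bias lets attention depend on graph structure even though self-attention itself is position-agnostic.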
The 2nd NS-CUK Weekly Seminar
Presenter: Sang Thanh Nguyen
Date: Mar 6th, 2023
Topic: Review on "DeeperGCN: All You Need to Train Deeper GCNs," arXiv
Schedule: https://nslab-cuk.github.io/seminar/
The 2nd NS-CUK Weekly Seminar
Presenter: Van Thuy Hoang
Date: Mar 6th, 2023
Topic: Review on "Everything is Connected: Graph Neural Networks," Current Opinion in Structural Biology
Schedule: https://nslab-cuk.github.io/seminar/
The 1st AI-CUK Weekly Joint Journal Club
Presenter: Van Thuy Hoang
Date: Jan 3rd, 2023
Topic: Review on "Global self-attention as a replacement for graph convolution," KDD 2022
Schedule: https://nslab-cuk.github.io/joint-journal-club/
Introduction
Renewable energy trends
(Source: International Renewable Energy Agency)
Offshore wind: wind power generated at sea
Onshore wind: wind power generated on land
Wind energy production is increasing every year
Annual planned spending in the renewable energy sector
(Source: International Renewable Energy Agency)
➔ Wind is the energy source receiving the largest investment after solar energy
Weighted-average LCOE of newly commissioned onshore wind projects by country
(Source: International Renewable Energy Agency)
• LCOE (Levelized Cost of Electricity): the cost for a power plant to generate 1 kWh of electricity
The cost of wind power is falling in every country
➔ Development of wind power is underway
Problem Statement
(Source: International Renewable Energy Agency)
• Energy shortages have increased in recent years
➔ Wind energy is drawing attention as a clean, renewable energy source
• Accurate forecasting of wind speed and direction is therefore needed to resolve the unstable supply of wind energy, a key problem in wind power generation
Problem Statement
Spatio-temporal wind speed forecasting using graph networks and novel Transformer architectures, Applied Energy 2022
• Graphs provide spatial information, but this approach has limitations
• Edges can express the connectivity between nodes, but connections alone cannot express detailed information about the relationship between nodes
(Source: Adaptive Spatio-temporal Graph Neural Network for traffic forecasting, Knowledge-Based Systems)
Represented as a graph, connections alone are limited in conveying the information between nodes
Problem Statement
• Energy shortages have increased in recent years
➔ Wind energy is drawing attention as a clean, renewable energy source
• Accurate forecasting of wind speed and direction is needed to resolve the unstable supply of wind energy
• For high accuracy, detailed spatial information must be reflected in the graph
Contributions
• This work proposes a graph neural network that can incorporate diverse time-series meteorological variables together with terrain information between observation stations
1. It further strengthens the understanding of the terrain and geographic environment
2. Regional characteristics can be better considered through terrain information
3. Information about station placement can be reflected through terrain information
➔ Stations in topographically similar locations interact more strongly with one another, which is useful for optimizing the observation-station network
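One way the terrain-aware interaction idea could look in practice is an edge weight that grows when stations are both geographically close and topographically similar. The Gaussian kernels, sigma values, and use of elevation as the terrain feature are assumptions for this sketch, not the proposal's actual formulation.

```python
# Hypothetical station-graph adjacency combining geographic distance with
# terrain (elevation) similarity; all constants are illustrative.
import numpy as np

coords = np.array([[0, 0], [10, 0], [0, 12], [30, 5]], float)  # station x, y (km)
elev   = np.array([120.0, 135.0, 480.0, 140.0])                # elevation (m)

d_geo  = np.linalg.norm(coords[:, None] - coords[None], axis=-1)
d_terr = np.abs(elev[:, None] - elev[None])

# combined affinity: close AND topographically similar -> strong edge
W = np.exp(-(d_geo / 15.0) ** 2) * np.exp(-(d_terr / 100.0) ** 2)
np.fill_diagonal(W, 0.0)
print(W.round(2))
```

Under these assumed kernels, stations 0 and 1 (near each other, similar elevation) get a much stronger edge than stations 0 and 2, which are nearly as close but sit at very different elevations.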
Experiment Plan
Baselines
• Comparison with SOTA (state-of-the-art) models such as ST-GCN (Spatial-Temporal Graph Convolutional Network)
Performance Evaluation
• MAE (Mean Absolute Error)
• RMSE (Root Mean Square Error)
• MAPE (Mean Absolute Percentage Error)
• SMAPE (Symmetric Mean Absolute Percentage Error)
• R2 (Coefficient of determination)
Sensitivity Experiment
• Analysis with and without terrain information
• Analysis of the effect of resolution (5 km to 90 m)
• Analysis of inter-station distance (20 km to 40 km, in 5 km steps)
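The planned evaluation metrics have straightforward definitions; plain-NumPy reference versions are shown below. Library implementations (e.g. scikit-learn's) may differ in edge-case handling such as zeros in the denominator.

```python
# Reference implementations of the planned forecast-evaluation metrics.
import numpy as np

def mae(y, p):   return np.mean(np.abs(y - p))
def rmse(y, p):  return np.sqrt(np.mean((y - p) ** 2))
def mape(y, p):  return np.mean(np.abs((y - p) / y)) * 100
def smape(y, p): return np.mean(2 * np.abs(p - y) / (np.abs(y) + np.abs(p))) * 100
def r2(y, p):    return 1 - np.sum((y - p) ** 2) / np.sum((y - np.mean(y)) ** 2)

y = np.array([5.0, 6.0, 7.5, 8.0])   # observed wind speed (m/s), toy values
p = np.array([4.8, 6.3, 7.2, 8.4])   # forecast
print(mae(y, p), rmse(y, p), r2(y, p))
```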
Conclusion
• This work proposes a multimodal spatio-temporal graph neural network that incorporates terrain information for accurate forecasting of wind speed and direction
• The improvement in prediction from incorporating terrain elevation and land cover data will be evaluated against existing SOTA models
• Sensitivity experiments will then be conducted to analyze and evaluate the proposed model's performance in detail