NS-CUK Seminar: V.T.Hoang, Review on "Representation Learning On Heterogeneous Information Networks with Graph Transformer", WWW 2023
1. Van Thuy Hoang
Dept. of Artificial Intelligence,
The Catholic University of Korea
hoangvanthuy90@gmail.com
2. Representation Learning on HINs with Graph Transformer
HINormer capitalizes on a larger-range aggregation mechanism for node representation learning, built on two components: a local structure encoder and a heterogeneous relation encoder.
5. Challenges
How to design an efficient node-level Transformer encoder that exploits local-view structure information
How to effectively capture the heterogeneous semantic relations between nodes
6. Heterogeneous Graph Neural Networks
Meta-path-free HGNNs remove the dependence on handcrafted meta-paths: they apply a message-passing mechanism directly on the original heterogeneous network, using node/edge type-aware modules, so that the model captures structural and semantic information simultaneously.
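The type-aware message passing described above can be sketched as follows. This is a minimal illustration, not the exact model from the paper: the function name, the per-edge-type weight matrices `W_rel`, and mean aggregation are all assumptions chosen for clarity.

```python
import numpy as np

def type_aware_message_passing(h, edges, edge_types, W_rel):
    """One propagation step on a heterogeneous graph: each edge's message
    is transformed by a weight matrix chosen by its edge type, then
    mean-aggregated at the destination node."""
    n, _ = h.shape
    out = np.zeros_like(h)
    deg = np.zeros(n)
    for (src, dst), t in zip(edges, edge_types):
        out[dst] += h[src] @ W_rel[t]   # relation-specific transform
        deg[dst] += 1
    deg[deg == 0] = 1                   # isolated nodes: avoid division by zero
    return out / deg[:, None]

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))             # 4 nodes, 8-dim features
edges = [(0, 1), (2, 1), (3, 2)]        # directed src -> dst
edge_types = [0, 1, 0]                  # e.g. 0 = "writes", 1 = "cites"
W_rel = {0: rng.normal(size=(8, 8)), 1: rng.normal(size=(8, 8))}
h_new = type_aware_message_passing(h, edges, edge_types, W_rel)
```

Because the transform depends on the edge type rather than on a precomputed meta-path, semantics enter the aggregation directly, which is the point the slide makes.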
7. THE PROPOSED MODEL: HINORMER
A local structure encoder and a heterogeneous relation encoder capture the feature-based local structural information and the heterogeneity-based semantic proximities of each node, respectively. The outputs of these two encoders serve as the node features and the relative positional encodings for the main part of the heterogeneous Graph Transformer.
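How the two encoder outputs feed the Transformer can be sketched as a single self-attention layer whose scores are biased by a relative positional term. This is a simplified single-head sketch under stated assumptions: the similarity form `R @ R.T` of the relational bias and all parameter names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def transformer_layer_with_rel_pos(Z, R, Wq, Wk, Wv):
    """Self-attention over a node's context sequence.
    Z: node features from the local structure encoder, shape (n, d).
    R: relational encodings from the heterogeneous relation encoder, (n, k).
    The relational similarity acts as a relative positional bias on scores."""
    scores = (Z @ Wq) @ (Z @ Wk).T / np.sqrt(Wq.shape[1])
    rel_bias = R @ R.T                  # assumed similarity-based bias
    attn = softmax(scores + rel_bias)
    return attn @ (Z @ Wv)

rng = np.random.default_rng(1)
Z = rng.normal(size=(5, 8))
R = rng.normal(size=(5, 3))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = transformer_layer_with_rel_pos(Z, R, Wq, Wk, Wv)
```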
8. Node-level Heterogeneous Graph Transformer Architecture
Since the features of different node types on HINs usually lie in different feature spaces, the first step is to map them into a shared feature space. The local structure encoder is then implemented as GNN-based neighborhood aggregation.
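The shared-space mapping amounts to one projection matrix per node type. A minimal sketch, assuming hypothetical types "author" and "paper" with different raw feature dimensions, both projected to a common dimension of 16:

```python
import numpy as np

rng = np.random.default_rng(0)
# Raw features live in type-specific spaces of different dimensions.
features_by_type = {
    "author": rng.normal(size=(3, 5)),   # 3 authors, 5-dim raw features
    "paper":  rng.normal(size=(4, 7)),   # 4 papers, 7-dim raw features
}
# One (learnable) projection matrix per node type, all mapping to d = 16.
W_type = {
    "author": rng.normal(size=(5, 16)),
    "paper":  rng.normal(size=(7, 16)),
}
# After projection every node, regardless of type, lives in the same space,
# so a single GNN / Transformer can operate over all of them.
shared = {t: X @ W_type[t] for t, X in features_by_type.items()}
```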
9. Heterogeneous Relation Encoder
This simple heterogeneous model further improves the ability of HINormer to capture heterogeneity. The relational encoding of node v is computed iteratively over steps t, where:
r_v^(0): the one-hot type embedding of node v
f(·; θ): a transformation function
r_v^(t): the relational encoding of node v at iteration step t
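The iteration above can be sketched as repeated propagation of type embeddings. Note the concrete update rule here (adding the neighbor sum, then L2-normalizing) is an illustrative stand-in for the paper's transformation function f(·; θ); only the initialization r^(0) as one-hot type embeddings is taken from the slide.

```python
import numpy as np

def relation_encoder(node_types, adj, steps):
    """Iteratively propagate node-type embeddings over the graph so that
    each node's encoding reflects the type composition of its neighborhood."""
    num_types = int(node_types.max()) + 1
    r = np.eye(num_types)[node_types]    # r^(0): one-hot type embeddings
    for _ in range(steps):
        r = r + adj @ r                  # mix in neighbors' encodings (assumed rule)
        r = r / np.linalg.norm(r, axis=1, keepdims=True)
    return r

node_types = np.array([0, 1, 1, 2])      # e.g. author, paper, paper, venue
adj = np.array([[0, 1, 0, 0],
                [1, 0, 0, 1],
                [0, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
r = relation_encoder(node_types, adj, steps=2)
```

The resulting encodings are what the main Transformer consumes as heterogeneity-aware relative positional information.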
14. CONCLUSION
HINormer is a Graph Transformer on heterogeneous information networks for node representation learning. It capitalizes on a self-attention mechanism assisted by two key components:
a local structure encoder, to capture the feature-based local structural information of nodes
a heterogeneous relation encoder, to capture the heterogeneous semantic relations between nodes