[DL輪読会] Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
DEEP LEARNING JP
[DLPapers]
http://deeplearning.jp/
Takumi Ohkuma, Nakayama Lab M1
Overall Architecture of the Set Transformer
The full Set Transformer is constructed by stacking SAB, ISAB, and PMA blocks, either as a single layer or as multiple layers.
[Figure: two example stacks. Example 1 passes the input through SAB-or-ISAB layers, then PMA_1, then an FC layer; Example 2 combines SAB-or-ISAB layers with PMA_4 and an FC layer.]
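The stacking described above can be sketched in code. Below is a minimal PyTorch sketch, assuming the block definitions from the Set Transformer paper (the MAB/SAB/PMA names follow the paper; the dimensions, head counts, and toy input are illustrative, and ISAB is omitted for brevity):

```python
import torch
import torch.nn as nn

class MAB(nn.Module):
    """Multihead Attention Block: H = LN(X + Attn(X, Y, Y)); MAB = LN(H + rFF(H))."""
    def __init__(self, dim, num_heads):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ln0, self.ln1 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, y):
        h = self.ln0(x + self.attn(x, y, y)[0])
        return self.ln1(h + self.ff(h))

class SAB(nn.Module):
    """Set Attention Block: self-attention over the set, SAB(X) = MAB(X, X)."""
    def __init__(self, dim, num_heads):
        super().__init__()
        self.mab = MAB(dim, num_heads)

    def forward(self, x):
        return self.mab(x, x)

class PMA(nn.Module):
    """Pooling by Multihead Attention: k learnable seed vectors attend to the set."""
    def __init__(self, dim, num_heads, k):
        super().__init__()
        self.seeds = nn.Parameter(torch.randn(1, k, dim))
        self.mab = MAB(dim, num_heads)

    def forward(self, x):
        return self.mab(self.seeds.expand(x.size(0), -1, -1), x)

# Example-1-style stack: embed -> SAB -> SAB -> PMA_1 -> FC
dim, heads = 16, 4
model = nn.Sequential(
    nn.Linear(3, dim),       # per-element embedding (input dim 3 is illustrative)
    SAB(dim, heads),
    SAB(dim, heads),
    PMA(dim, heads, k=1),    # pool the whole set into a single vector
    nn.Linear(dim, 1),       # FC head
)

x = torch.randn(2, 10, 3)             # batch of 2 sets, 10 elements each
perm = x[:, torch.randperm(10), :]    # reorder the elements within each set
out = model(x)                        # matches model(perm) up to floating-point error
```

Because SAB layers are permutation-equivariant and PMA pools over the set, shuffling the elements of an input set leaves the output unchanged, which is the permutation invariance the figure's stacks are designed to guarantee.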