4. Transformer Model
Source: Attention Is All You Need (2017), Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob
Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin
7. Attention vs Self-Attention
Source: Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (2015),
K. Xu, J. Ba, R. Kiros, K. Cho, A. Courville, R. Salakhutdinov, R. Zemel, Y. Bengio
10. Query, Key, Value…?
11. Query, Key, Value…?
I like flipped-learning which is very effective.
[Diagram: the query "which" is compared against every token of the sentence: I, like, flipped-learning, which, is, very, effective]
For the query "which", how important (attention) is each word in the sentence?
12. Query, Key, Value…?
I like flipped-learning which is very effective.
[Diagram: the same sentence, now viewed from the query "flipped-learning" against the key "which": I, like, flipped-learning, which, is, very, effective]
How important (attention) is "which" to "flipped-learning"?
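The two slides above can be sketched in code: scaled dot-product attention computes, for every query token, a weight over all key tokens. A minimal NumPy sketch follows, using the example sentence from the slides; the embeddings and projection matrices are random toy values (assumptions for illustration), not trained parameters.

```python
import numpy as np

# The example sentence from the slides, tokenized.
tokens = ["I", "like", "flipped-learning", "which", "is", "very", "effective"]

rng = np.random.default_rng(0)
d = 8                                  # toy embedding / head dimension
X = rng.normal(size=(len(tokens), d))  # one random embedding per token

# In a real Transformer these projections are learned; here they are random.
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
scores = Q @ K.T / np.sqrt(d)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
output = weights @ V

# Slide 11: the row for "which" as query — how much it attends to each token.
q = tokens.index("which")
print(dict(zip(tokens, weights[q].round(3))))

# Slide 12: "which" as key — how much the query "flipped-learning" attends to it.
print(weights[tokens.index("flipped-learning"), q].round(3))
```

Reading the attention matrix row-wise answers slide 11 (one query against all keys), while a single entry of the "flipped-learning" row answers slide 12 (one query against the key "which").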