Seq2Seq learning and recent advances
Vu Pham, 08.03.2017
Agenda
Recap: CNN for NLP
Recurrent cell
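The slide only shows the cell diagram. As a minimal sketch of the vanilla recurrent cell it depicts (the weight names W_xh, W_hh, b_h are mine, not from the slides):

```python
import numpy as np

def rnn_cell(x_t, h_prev, W_xh, W_hh, b_h):
    # Vanilla recurrent cell: mix the current input with the previous
    # hidden state, then squash through tanh to get the new hidden state.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)
```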
LSTM cell
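Again only the diagram survives; a rough sketch of the standard LSTM update it illustrates, with the four gate weights stacked into hypothetical matrices W, U and bias b:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_cell(x_t, h_prev, c_prev, W, U, b):
    # W: (input_dim, 4*hidden), U: (hidden, 4*hidden), b: (4*hidden,)
    z = x_t @ W + h_prev @ U + b
    i, f, o, g = np.split(z, 4)            # input, forget, output gates + candidate
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c_t = f * c_prev + i * np.tanh(g)      # gated update of the cell state
    h_t = o * np.tanh(c_t)                 # hidden state exposed to the next step
    return h_t, c_t
```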
Recurrent neural nets
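The diagram shows the network unrolled over time, ending in a summary vector C. A sketch of that unrolling with the same tanh recurrence as above (weight names again mine):

```python
import numpy as np

def run_rnn(xs, W_xh, W_hh, b_h):
    # Unroll a vanilla RNN over the input sequence xs; the final hidden
    # state serves as a fixed-size summary (the "C" in the diagram).
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in xs:
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
        states.append(h)
    return states, h
```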
Avoid explicit segmentation of the output sequence
Seq2Seq: intro
Seq2Seq: idea
Encoder-decoder diagram: the encoder reads the source sentence "I go to the zoo <EOS>" into a single context vector C, and the decoder generates the target sentence "Je vais au zoo <EOS>" from C.
Sequence to Sequence Learning with Neural Networks, NIPS 2014
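A toy sketch of the idea, not the paper's exact architecture: one RNN encodes the source into C, and a second RNN, started from C, produces the target tokens. All weight names are hypothetical; the decode_step closure returned here is reused in the sketches below.

```python
import numpy as np

def encode(src_embs, W_xh, W_hh, b_h):
    # Encoder RNN: its final hidden state is the context vector C.
    h = np.zeros(W_hh.shape[0])
    for x_t in src_embs:
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    return h

def make_decoder(W_yh, W_hh, b_h, W_out, b_out):
    # Decoder step: (previous target embedding, previous state) ->
    # (new state, unnormalized scores over the target vocabulary).
    def decode_step(y_emb, h):
        h = np.tanh(y_emb @ W_yh + h @ W_hh + b_h)
        return h, h @ W_out + b_out
    return decode_step
```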
Seq2Seq: Training
Training diagram: the encoder reads "I go to the zoo <EOS>"; the decoder is fed the gold target tokens "Je vais au zoo" and is trained to predict "Je vais au zoo <EOS>", while the encoder positions carry no prediction targets (∅).
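A sketch of the usual teacher-forcing objective this diagram corresponds to: the decoder is fed the gold previous token and pays a cross-entropy penalty on the gold next token. decode_step is the hypothetical closure from the sketch above; embed, bos_id and target_ids are assumed inputs.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def teacher_forcing_loss(decode_step, h0, target_ids, embed, bos_id):
    # target_ids is the gold translation, ending with the <EOS> id.
    h, prev_id, loss = h0, bos_id, 0.0
    for gold_id in target_ids:
        h, logits = decode_step(embed[prev_id], h)
        loss += -np.log(softmax(logits)[gold_id])  # cross-entropy at this step
        prev_id = gold_id                          # teacher forcing: feed the gold token
    return loss / len(target_ids)
```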
Seq2Seq: Inference
Inference diagram: the encoder reads "I go to the zoo <EOS>"; at each decoder step the previously emitted token is fed back as the next input (Je → vais → au → zoo → <EOS>).
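A sketch of that greedy feedback loop, using the same hypothetical decode_step(prev_embedding, state) interface as above:

```python
import numpy as np

def greedy_decode(decode_step, h0, embed, bos_id, eos_id, max_len=50):
    h, prev_id, out = h0, bos_id, []
    for _ in range(max_len):
        h, logits = decode_step(embed[prev_id], h)
        prev_id = int(np.argmax(logits))   # feed the most likely token back in
        if prev_id == eos_id:
            break
        out.append(prev_id)
    return out
```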
Seq2Seq: Inference with (pruned) beam search
Beam search diagram: several candidate prefixes (Je, Elle, Ils, ...) are kept in parallel; at each step every surviving hypothesis is expanded (vais, suis, allé, au, ...) and only the highest-scoring ones are retained until <EOS>.
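A sketch of pruned beam search over the same hypothetical decoder interface: instead of one greedy hypothesis, the k highest-scoring partial translations are kept and expanded at every step.

```python
import numpy as np

def log_softmax(z):
    z = z - z.max()
    return z - np.log(np.exp(z).sum())

def beam_search(decode_step, h0, embed, bos_id, eos_id, k=5, max_len=50):
    # A hypothesis is (log-probability, token list, decoder state, finished?).
    beams = [(0.0, [bos_id], h0, False)]
    for _ in range(max_len):
        candidates = []
        for logp, toks, h, done in beams:
            if done:                           # finished hypotheses are kept as-is
                candidates.append((logp, toks, h, True))
                continue
            h2, logits = decode_step(embed[toks[-1]], h)
            lp = log_softmax(logits)
            for tok in np.argsort(lp)[-k:]:    # expand only the k best next tokens
                tok = int(tok)
                candidates.append((logp + lp[tok], toks + [tok], h2, tok == eos_id))
        # Prune: keep the k best hypotheses overall.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:k]
        if all(done for _, _, _, done in beams):
            break
    return beams[0][1][1:]                     # best token sequence, without the start symbol
```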
Attention mechanism
Attention diagram: instead of a single context vector, every encoder state h0 ... hn is scored against the current decoder state by a small MLP; the resulting weights a0, a1, a2, a3, ... give a context c as a weighted sum of the encoder states, which is used to emit the next token (Je).
Implemented in TensorFlow seq2seq().
Neural Machine Translation by Jointly Learning to Align and Translate
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
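A sketch of one additive attention step as drawn on the slide (weight names W_dec, W_enc, v are mine):

```python
import numpy as np

def attention(dec_state, enc_states, W_dec, W_enc, v):
    # enc_states: (n, enc_dim) stacked encoder states h_0 ... h_n.
    # A small MLP scores each encoder state against the decoder state.
    scores = np.tanh(dec_state @ W_dec + enc_states @ W_enc) @ v
    a = np.exp(scores - scores.max())
    a = a / a.sum()          # attention weights a_0 ... a_n (softmax over positions)
    c = a @ enc_states       # context: weighted sum of the encoder states
    return c, a
```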
What we learned
Machine Translation
Image Captioning
Pipeline diagram: VGG convolutional features of size 14 x 14 x 512 are reshaped to 196 x 512, i.e. a "sequence" of 196 spatial positions with 512-d vectors, and fed into a Seq2Seq decoder that generates the caption.
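The 196 in the diagram is simply 14 x 14: the VGG feature map is flattened into 196 spatial positions of 512 dimensions each, which the decoder treats as its input "sequence". A small sketch with a placeholder array:

```python
import numpy as np

# Placeholder for the VGG conv feature map of one image: 14 x 14 x 512.
feature_map = np.random.randn(14, 14, 512)

# Flatten the spatial grid into 196 annotation vectors of size 512.
annotations = feature_map.reshape(196, 512)
print(annotations.shape)   # (196, 512)
```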
Grammar parser
Grammar as a Foreign Language, NIPS 2015
Conversational bot
A Neural Conversational Model, ICML Workshop 2015
Skip-thoughts
Skip-Thought Vectors, NIPS 2015
What else?
What we learned