
Hayahide Yamagishi

[PACLING2019] Improving Context-aware Neural Machine Translation with Target-side Context
[Master's Thesis Presentation] Neural Machine Translation Using Target-Language Document Context
[Paper Reading Group Slides] Beyond Error Propagation in Neural Machine Translation: Characteristics of Language Also Matter
[ACL 2018 Reading Group Slides] Sharp Nearby, Fuzzy Far Away: How Neural Language Models Use Context
[NAACL 2018 Reading Group] Deep Communicating Agents for Abstractive Summarization
[Paper Reading Group Slides] Asynchronous Bidirectional Decoding for Neural Machine Translation
[ML Paper Reading Group Slides] Teaching Machines to Read and Comprehend
[EMNLP 2017 Reading Group] Efficient Attention using a Fixed-Size Memory Representation
[ML Paper Reading Group Slides] Training RNNs as Fast as CNNs
Error Analysis of How Adding Information to Input Sentences Changes NMT Output
[ACL 2017 Reading Group] What do Neural Machine Translation Models Learn about Morphology?
Why neural translations are the right length
A hierarchical neural autoencoder for paragraphs and documents
Before Reading Neural Network Papers
Controlling the Voice of Output Sentences in Neural Japanese-to-English Translation