読解支援@2015 06-26 (reading-support study group)
Uploaded by sekizawayuuki
1.
Efficient Estimation of Word Representations in Vector Space
Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean
In Proceedings of the International Conference on Learning Representations (ICLR), 2013
Presented by Yuuki Sekizawa, 2015/06/26
2.
What this work does
• Proposed models
– Word vector representations learned from a very large dataset
– Compared on accuracy and computational cost
• Trained on a 1.6-billion-word dataset in less than a day
– Evaluated on both syntactic and semantic tasks
– Better results than neural network models
3.
Neural network models
• Neural network language model (NNLM)
– Predicts a word from the preceding words (their number must be specified)
– Four layers: input, projection, hidden, output
– Computation is quite complex
• Recurrent neural net language model (RNNLM)
– No projection layer; uses past context via recurrence
– Somewhat less computation than the NNLM
• Parallel training of neural networks
– The model is replicated and trained in parallel via a centralized parameter server
4.
New log-linear models
• Continuous Bag-of-Words Model (CBOW)
– Similar to the NNLM
– Using four words of context on each side gave the best results
• Continuous Skip-gram Model
– More distant words are less related to the current word
– Distant words are therefore given less weight
– Window size of at most 10 in the experiments
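The two objectives above can be illustrated as simple (context, target) pair generators. This is a minimal pure-Python sketch, not the paper's implementation; the down-weighting of distant words in Skip-gram is approximated here by randomly shrinking the window per position, so nearby words are sampled more often.

```python
import random

def cbow_pairs(tokens, window=4):
    """CBOW: predict the center word from the surrounding context words."""
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        if context:
            pairs.append((context, target))
    return pairs

def skipgram_pairs(tokens, max_window=10, rng=None):
    """Skip-gram: predict each context word from the center word.
    Distant words get less weight: a smaller window is sampled per position,
    so far-away words appear in fewer training pairs."""
    rng = rng or random.Random(0)
    pairs = []
    for i, center in enumerate(tokens):
        w = rng.randint(1, max_window)  # random shrink -> nearby words favored
        for j in range(max(0, i - w), min(len(tokens), i + w + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sent = "the quick brown fox jumps".split()
print(cbow_pairs(sent, window=2)[0])  # (['quick', 'brown'], 'the')
```

In a full trainer these pairs would feed a log-linear classifier over the vocabulary; the point here is only the shape of the two prediction tasks.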
5.
The evaluation task
• A test set is defined and used
– 5 types of semantic questions (8,869 questions)
– 9 types of syntactic questions (10,675 questions)
– Lists of similar word pairs were created manually
– Each question is formed by randomly picking two word pairs from a list
– Multi-word entries are excluded (e.g. New York)
6.
Evaluation metric
• Accuracy over all questions
– Semantic and syntactic questions are treated separately
• Accuracy is computed for each set
– A question counts as correct only when the word whose vector the method finds closest exactly matches the answer word
7.
Training-data size vs. vector dimensionality
• Experiment with CBOW: visualizing the change in accuracy
– Questions contain only the 30,000 most frequent words in Google News
– Accuracy improves as either factor grows
– Accuracy changes sharply when the dimensionality goes from 50 to 100
– Increasing only one of the two can make accuracy worse
8.
Model comparison 1
• Training data: 320 million words, vocabulary of 8,200 words
• Vector dimensionality: 640
• Both conditions held fixed across the compared models
9.
Model comparison 2
10.
Model comparison 3
• CPU counts are estimates
– They vary somewhat because of other tasks running on the machines
• NNLM vector dimensionality is 100
– 1000 dimensions would take far too long to train
• Even at 100 dimensions, training already takes considerable time
11.
Summary
• Examined the quality of word vector representations from various models
• Models simpler than neural networks
– Used CBOW and Skip-gram
– Produced high-quality word vectors
– Less computationally complex
• Faster training
• Dimensionality can be made larger