Neural Network-Based
Abstract Generation for
Opinions and Arguments
Lu Wang and Wang Ling
NAACL 2016
Paper introduction
Presentation: Tomonori Kodaira
1
Abstract
• Document-to-sentence abstractive summarization
• When encoding the documents, sampling is used to keep the computation tractable:
  - the top-K text units, ranked by an importance score, are used as input
2
Introduction
• The authors present an attention-based NN model for generating abstractive summaries of opinionated text.
• Their system takes as input a set of text units, and then outputs a one-sentence abstractive summary.
• Two types of opinionated text:
  - Movie reviews
  - Arguments on controversial topics
3
Introduction
• Systems:
  - Attention-based model (Bahdanau et al., 2014)
  - An importance-based sampling method
• The importance score of a text unit is estimated
from a regression model with pairwise preference-
based sampling.
4
Data Collection
Rotten Tomatoes (www.rottentomatoes.com)
• There are professional critics and user-generated reviews.
• Each movie has a one-sentence critic consensus.
Data
• 246,164 critics and their opinion consensus for 3,731 movies
• train: 2,458, validation: 536, test: 737 movies
Movie Reviews
5
Data Collection
idebate.org
• This is a Wikipedia-style website for gathering pro and con arguments on controversial issues.
• Each point contains a one-sentence central claim.
Data
• 676 debates with 2,259 claims.
• train: 450, validation: 67, test: 150 debates
Arguments on controversial topics
6
Text units
Data Collection
7
The Neural Network-Based Abstract
Generation Model
• A summary y is composed of a sequence of words y_1, …, y_{|y|}.
• The input consists of an arbitrary number of reviews or arguments (text units):
  x = {x_1, …, x_M}
• Each text unit x_k is composed of a sequence of words x^k_1, …, x^k_{|x^k|}.
Problem Formulation
8
The Neural Network-Based Abstract
Generation Model
• The summary is generated as a sequence of word-level predictions:
  log P(y|x) = Σ_{j=1}^{|y|} log P(y_j | y_1, …, y_{j-1}, x)
  P(y_j | y_1, …, y_{j-1}, x) = softmax(h_j)
• h_j is the RNN state variable:
  h_j = g(y_{j-1}, h_{j-1}, s)
• g is an LSTM network (Hochreiter and Schmidhuber, 1997)
Decoder
9
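A minimal sketch (not the authors' code) of this factorized log-likelihood, assuming the decoder states h_j are already computed and a hypothetical output projection W_out maps each state to vocabulary logits before the softmax:

```python
import numpy as np

def softmax(z):
    z = z - z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def sequence_log_likelihood(states, targets, W_out):
    """log P(y|x) = sum_j log P(y_j | y_1, ..., y_{j-1}, x).

    states  : list of decoder states h_j (one per output word)
    targets : list of target word ids y_j
    W_out   : (vocab_size, state_dim) output projection (assumed here)
    """
    total = 0.0
    for h_j, y_j in zip(states, targets):
        probs = softmax(W_out @ h_j)     # softmax over the vocabulary at step j
        total += np.log(probs[y_j])
    return total
```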
The Neural Network-Based Abstract
Generation Model
• LSTM
• The model concatenates the representation of the previous output word y_{j-1} and the input representation s into u_j, which is fed to the LSTM.
Decoder
10
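A sketch of one decoder step under this description, using torch.nn.LSTMCell; the vocabulary size and dimensions are illustrative, not taken from the paper:

```python
import torch
import torch.nn as nn

emb_dim, input_dim, state_dim = 300, 150, 150     # illustrative sizes
embed = nn.Embedding(10000, emb_dim)              # hypothetical 10k-word vocabulary
cell = nn.LSTMCell(emb_dim + input_dim, state_dim)

def decoder_step(y_prev, s, h_prev, c_prev):
    """One step: u_j = [embedding(y_{j-1}); s] is fed to the LSTM.

    y_prev : LongTensor of word ids, shape (batch,)
    s      : input representation, shape (batch, input_dim)
    """
    u_j = torch.cat([embed(y_prev), s], dim=-1)   # concatenate y_{j-1} and s
    h_j, c_j = cell(u_j, (h_prev, c_prev))
    return h_j, c_j
```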
The Neural Network-Based Abstract
Generation Model
• The representation s of the input text units is computed with an attention model (Bahdanau et al., 2014):
  s = Σ_i a_i b_i
• The authors construct b_i with a bidirectional LSTM, using the same LSTM formulation with u_j = x_j.
• a_i = softmax(v(b_i, h_{j-1}))
  v(b_i, h_{j-1}) = W_s · tanh(W_cg b_i + W_hg h_{j-1})
Encoder
11
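A small numpy sketch of these attention formulas, with the weight matrices W_s, W_cg, W_hg passed in as placeholders:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attend(B, h_prev, W_s, W_cg, W_hg):
    """Attention over encoder states, following the formulas above.

    B      : (n, d_b) matrix of bidirectional-LSTM states b_i
    h_prev : (d_h,) previous decoder state h_{j-1}
    W_s (d_a,), W_cg (d_a, d_b), W_hg (d_a, d_h) are placeholder weights.
    """
    scores = np.array([W_s @ np.tanh(W_cg @ b_i + W_hg @ h_prev) for b_i in B])
    a = softmax(scores)     # a_i = softmax(v(b_i, h_{j-1}))
    s = a @ B               # s = sum_i a_i * b_i
    return s, a
```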
The Neural Network-Based Abstract
Generation Model
Their input consists of multiple separate text units, which can be concatenated into a single sequence z.
There are two problems:
• The model is sensitive to the order of the text units
• z may contain thousands of words
Attention Over Multiple Inputs
12
The Neural Network-Based Abstract
Generation Model
Sub-sampling from the input
• They define an importance score f(x_k) ∈ [0, 1] for each document x_k.
• K candidates are sampled according to this score
Attention Over Multiple Inputs
13
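The slides only state that K candidates are selected according to f(x_k); a minimal sketch of one possible reading, taking the top-K (K = 5 in the experimental setup):

```python
def subsample(text_units, importance, K=5):
    """Keep the K text units with the highest importance score f(x_k).

    text_units : list of text units x_k
    importance : list of scores f(x_k) in [0, 1]
    Taking the top-K is one reading; sampling proportionally to the
    scores would be another option consistent with the slides.
    """
    ranked = sorted(zip(importance, range(len(text_units))), reverse=True)
    return [text_units[i] for _, i in ranked[:K]]
```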
The Neural Network-Based Abstract
Generation Model
A ridge regression model with an additional regularizer.
• f(x_k) = r_k · w is learned by minimizing
  ||Rw − L||_2^2 + λ·||R′w − L′||_2^2 + β·||w||_2^2
• Each text unit x_k is represented as a d-dimensional feature vector r_k ∈ R^d.
Importance Estimation
14
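This objective has a closed-form solution; a small numpy sketch, where R and L hold the feature vectors r_k and their importance labels, and R_pref, L_pref hold the pairwise preference-based samples (variable names and regularization values are illustrative):

```python
import numpy as np

def fit_importance_weights(R, L, R_pref, L_pref, lam=0.1, beta=0.1):
    """Minimize ||Rw - L||^2 + lam*||R'w - L'||^2 + beta*||w||^2 in closed form.

    R (n, d), L (n,)   : feature vectors and importance labels
    R_pref, L_pref     : pairwise preference-based samples (same layout)
    lam, beta          : regularization weights (illustrative values)
    """
    d = R.shape[1]
    A = R.T @ R + lam * (R_pref.T @ R_pref) + beta * np.eye(d)
    b = R.T @ L + lam * (R_pref.T @ L_pref)
    return np.linalg.solve(A, b)   # then f(x_k) = r_k . w
```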
The Neural Network-Based Abstract
Generation Model
• At test time, they re-rank the n-best summaries according to their cosine similarity with the input text units.
• The one with the highest similarity is included in the
final summary.
Post-processing
15
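A sketch of this re-ranking step using simple word-count vectors and cosine similarity; the exact text representation used by the authors is not specified on the slide:

```python
from collections import Counter
import math

def cosine(c1, c2):
    num = sum(c1[w] * c2[w] for w in set(c1) & set(c2))
    den = math.sqrt(sum(v * v for v in c1.values())) * math.sqrt(sum(v * v for v in c2.values()))
    return num / den if den else 0.0

def rerank(nbest, input_units):
    """Pick the candidate summary closest (by cosine over word counts) to the input."""
    input_counts = Counter(w for unit in input_units for w in unit.split())
    return max(nbest, key=lambda summary: cosine(Counter(summary.split()), input_counts))
```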
Experimental Setup
• Data Preprocessing
  - Stanford CoreNLP (Manning et al., 2014)
• Pre-trained Embeddings and Features
  - Word embeddings: 300 dimensions
  - They extend their model with additional features
16
• Hyperparameters
  - The LSTMs are defined with states and cells of 150 dimensions
  - The attention model: 100 dimensions
  - Training is performed via AdaGrad (Duchi et al., 2011)
• Evaluation: BLEU
• The importance-based sampling rate K is set to 5
• Decoding: beam search with beam size 20
Experimental Setup
17
Results
• MRR (Mean Reciprocal Rank)
• NDCG (Normalized Discounted Cumulative Gain)
Importance Estimation Evaluation
18
Results
Importance Estimation Evaluation
19
Results
Human Evaluation on Summary Quality
20
Results
Sampling Effect
21
Conclusion
• The authors presented a neural approach to generating abstractive summaries of opinionated text.
• They employed an attention-based method that finds salient information across the different input text units.
• They deployed an importance-based sampling mechanism for model training.
• Their system obtained state-of-the-art results.
22