Yanghoon Kim, Hwanhee Lee, Joongbo Shin and Kyomin Jung
Improving Neural Question Generation
using Answer Separation
Presenter: Yanghoon Kim
Background
Neural question generation (NQG)
- Generating a question from a given text passage with deep neural networks.
Importance of NQG
- Generating questions for educational materials.
- Generating questions for improving QA systems.
2
Original passage: John Francis O’Hara was elected president of Notre Dame in 1934.
Generated question 1: Who was elected president of Notre Dame in 1934?
Generated question 2: When was John Francis O’Hara elected president of Notre Dame?
Problem
Previous NQG systems suffer from a critical problem
- Some models don’t take the question target into account.
- RNNs often follow a shallow generation process.
- Some models fail to properly capture the target answer (the question target).
- A significant proportion of generated questions include words from the target answer.
3
Original passage: John Francis O’Hara was elected president of Notre Dame in 1934.
Given target answer: John Francis O’Hara
Correctly generated question: Who was elected president of Notre Dame in 1934?
Incorrectly generated question: Who was elected John Francis?
Contribution
We propose answer-separated seq2seq
- Treats the target answer (question target) and the passage separately.
- Prevents the generated question from including words in the target answer.
- Better captures the information from both the target answer and the passage.
We propose keyword-net
- Keeps the model consistently aware of the target answer.
- Extracts the key information from the target answer.
We use a retrieval-style word generator
- Takes word meaning into account when generating words.
4
Task Definition
5
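The slide carries only the heading, so here is a minimal statement of the task in standard notation (my phrasing, not transcribed from the slide): given a passage X and a target answer A (a span of X), generate the question Y that is most probable under the model,

\hat{Y} = \arg\max_{Y} P_{\theta}(Y \mid X, A)
        = \arg\max_{Y} \prod_{t=1}^{|Y|} P_{\theta}(y_t \mid y_{<t}, X, A)

where y_t is the t-th word of the question and \theta are the model parameters.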
Model
Base model
- We use RNN encoder-decoder with attention
Answer-separated seq2seq consists of
- Answer-separated passage encoder
- Target answer encoder
- Answer-separated decoder
- keyword-net
- Retrieval-style word generator
6
Model
Answer-separated passage encoder
- A simple preprocessing step replaces the target answer in the input passage with the special token <a> (sketched in code below).
- Original passage: Steve Jobs is the founder of Apple.
- Masked passage: Steve Jobs is the <a> .
- A one-layer bi-LSTM
Answer encoder
- A one-layer bi-LSTM
7
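A minimal PyTorch sketch of the answer masking step and the encoder architecture described above; the mask_answer helper, hyperparameters, and tensor shapes are illustrative assumptions rather than the authors' released code.

import torch
import torch.nn as nn

def mask_answer(passage_tokens, answer_start, answer_end, mask_token="<a>"):
    # Replace the target-answer span [answer_start, answer_end) with a single <a> token.
    return passage_tokens[:answer_start] + [mask_token] + passage_tokens[answer_end:]

# mask_answer("Steve Jobs is the founder of Apple .".split(), 4, 7)
# -> ['Steve', 'Jobs', 'is', 'the', '<a>', '.']

class BiLSTMEncoder(nn.Module):
    # One-layer bi-LSTM; one instance encodes the masked passage, another the target answer.
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, num_layers=1,
                            batch_first=True, bidirectional=True)

    def forward(self, token_ids):              # token_ids: (batch, seq_len)
        outputs, (h_n, _) = self.lstm(self.embed(token_ids))
        return outputs, h_n                    # outputs: (batch, seq_len, 2 * hidden_dim)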
Model
Answer-separated decoder
- A one-layer LSTM
- keyword-net
- Keeps the model consistently aware of the target answer.
- Extracts key information from the target answer (see the sketch after this slide).
- Passage: Steve Jobs is the founder of Apple
- Target answer: founder of Apple
8
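One way to realize the keyword-net idea in PyTorch: a keyword vector, initialized from a query such as the passage context vector at the current decoding step, is refined by repeated attention over the answer encoder states. The layer count, initialization, and names below are my assumptions, not the paper's exact equations.

import torch
import torch.nn as nn

class KeywordNet(nn.Module):
    # Refines a keyword vector by repeatedly attending over the answer encoder
    # states, keeping the decoder aware of the target answer at every step.
    def __init__(self, num_layers=2):
        super().__init__()
        self.num_layers = num_layers

    def forward(self, query, answer_states):
        # query:         (batch, dim)           e.g. the current passage context vector
        # answer_states: (batch, ans_len, dim)  outputs of the answer encoder
        keyword = query
        for _ in range(self.num_layers):
            scores = torch.bmm(answer_states, keyword.unsqueeze(2))  # (batch, ans_len, 1)
            weights = torch.softmax(scores, dim=1)
            keyword = (weights * answer_states).sum(dim=1)           # (batch, dim)
        return keyword  # concatenated with the decoder input at each time step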
Model
Answer-separated decoder
- Retrieval-style word generator (Ma et al. 2018)*
- seq2seq models tend to memorize sequence patterns rather than reflect
word meanings.
- The word generator produces words by querying distributed word
representations (see the sketch below).
*“Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation” (Ma et al. 2018)
9
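A simplified PyTorch sketch of a retrieval-style output layer: a query vector built from the decoder state is matched against the distributed word representations to score each vocabulary word. Ma et al. (2018) use a more elaborate learned scoring function; the dot-product scoring and layer names here are simplifying assumptions.

import torch
import torch.nn as nn

class RetrievalWordGenerator(nn.Module):
    # Scores each vocabulary word by matching a query vector (built from the
    # decoder state) against that word's embedding, instead of using a plain
    # linear-softmax output layer.
    def __init__(self, hidden_dim, emb_dim, vocab_size):
        super().__init__()
        self.query_proj = nn.Linear(hidden_dim, emb_dim)
        self.word_embs = nn.Embedding(vocab_size, emb_dim)  # distributed word representations

    def forward(self, decoder_state):                        # (batch, hidden_dim)
        query = torch.tanh(self.query_proj(decoder_state))   # (batch, emb_dim)
        scores = query @ self.word_embs.weight.t()           # (batch, vocab_size)
        return torch.log_softmax(scores, dim=-1)             # log-probabilities over words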
Experiment
Data
- Processed version of SQuAD 1.1
- Data split 1: 70,484/10,570/11,877 (train/dev/test)
- Data split 2: 86,635/8,965/8,964
Evaluation
Our model (ASs2s) outperforms the previous state-of-the-art model.
10
Experiment
Impact of answer separation
- Ability to capture target answer
- We checked whether the target answer is included in the generated question
(a simple word-overlap check like the one sketched below).
- AP: answer position feature (BIO scheme)
- (Song et al. 2018) used the copy mechanism.
11
Our model is better at generating the right question for a given target answer.
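The slide does not give the exact matching rule used for this check; a simple word-overlap version might look like the following, where the stop-word filtering is my assumption.

def answer_in_question(generated_question, target_answer):
    # True if any content word of the target answer leaks into the generated question.
    stop_words = {"the", "a", "an", "of", "in", "on", "at", "to", "and", "or"}
    answer_words = {w for w in target_answer.lower().split() if w not in stop_words}
    question_words = set(generated_question.lower().split())
    return bool(answer_words & question_words)

# answer_in_question("who was elected john francis ?", "john francis o'hara")  -> True
# answer_in_question("who was elected president of notre dame in 1934 ?",
#                    "john francis o'hara")                                    -> False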
Experiment
Impact of answer separation
- Interrogative word prediction
- “What” takes up more than half of the whole training set.
- “Which”: e.g., “Which year” can also be expressed as “When”.
- “Why” and yes/no questions take up only 1.5% and 1.2% of the training set, respectively.
12
Our model is better at predicting the question type for a given target answer.
Experiment
Impact of answer separation
- Attention from the <a> token
- (a) is the attention matrix from our model.
- (b) is the attention matrix from seq2seq + AP.
- The <a> token gives its highest attention weight to the interrogative word “who” in (a).
13
Experiment
Question generation for machine comprehension
- Use named entities as target answers to generate synthetic data for a machine
comprehension system (QANet by Google).
- ALL: evaluation result on the SQuAD dev set (10k)
- NER: evaluation result on the partial SQuAD dev set (4k)
- answers consisting of a single named entity
14
Conclusion
We propose Answer-separated seq2seq for NQG
- Separate use of the target answer and the passage (with the target answer masked out)
- By masking the target answer inside the passage
- By using keyword-net to extract key features from the target answer
- By using a retrieval-style word generator to capture word-meaning information
- Our model can
- Reduce the probability that the generated question includes the target answer
- Generate fluent and correct questions for the given passage and target answer
- Better infer the question type
Thank you for listening!
Code, paper: https://yanghoonkim.github.io
Questions: ad26kr@snu.ac.kr
