On Generating Fact-Infused Question Variations

Arthur Deschamps, Sujatha Das Gollapalli, See-Kiong Ng


Abstract
To fully model the human-like ability to ask questions, automatic question generation (QG) models must be able to produce multiple expressions of the same question with different levels of detail. Unfortunately, existing datasets available for learning QG do not include paraphrases or question variations, limiting a model’s ability to learn this capability. To address this limitation, we present FIRS, a dataset containing human-generated fact-infused rewrites of questions from the widely-used SQuAD dataset. Questions in FIRS were obtained by combining a given question with facts about entities referenced in the question. We study a double encoder-decoder model, Fact-Infused Question Generator (FIQG), for learning to generate fact-infused questions from a given question. Experimental results show that FIQG effectively incorporates information from facts to add more detail to a given question. To the best of our knowledge, ours is the first study to present fact-infusion as a novel form of question paraphrasing.
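As a rough illustration of the fact-infusion idea described above (not the paper’s FIQG model, which is a learned double encoder-decoder), one can think of the task as rewriting a question so that a fact about a referenced entity is folded in, for example as an appositive clause. The entity and fact strings below are hypothetical examples, not drawn from the FIRS dataset.

```python
def infuse_fact(question: str, entity: str, fact: str) -> str:
    """Toy fact infusion: splice `fact` in as an appositive clause
    after the first mention of `entity` in the question."""
    if entity not in question:
        return question  # no entity mention to attach the fact to
    return question.replace(entity, f"{entity}, {fact},", 1)

q = "When did Einstein publish the theory of general relativity?"
print(infuse_fact(q, "Einstein", "the German-born physicist"))
# When did Einstein, the German-born physicist, publish the theory of general relativity?
```

A learned model such as FIQG would instead condition jointly on the question and retrieved facts, but the sketch shows the kind of input/output pairing the FIRS rewrites capture.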
Anthology ID:
2021.ranlp-1.39
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month:
September
Year:
2021
Address:
Held Online
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
335–345
URL:
https://aclanthology.org/2021.ranlp-1.39
Cite (ACL):
Arthur Deschamps, Sujatha Das Gollapalli, and See-Kiong Ng. 2021. On Generating Fact-Infused Question Variations. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 335–345, Held Online. INCOMA Ltd.
Cite (Informal):
On Generating Fact-Infused Question Variations (Deschamps et al., RANLP 2021)
PDF:
https://preview.aclanthology.org/update-css-js/2021.ranlp-1.39.pdf
Code
 nus-ids/ranlp21-fiqv
Data
HotpotQA
SQuAD