Unsupervised Question Answering via Answer Diversifying

Yuxiang Nie, Heyan Huang, Zewen Chi, Xian-Ling Mao


Abstract
Unsupervised question answering is an attractive task because it does not depend on labeled data. Previous works usually make use of heuristic rules together with pre-trained models to construct data and train QA models. However, most of these works regard named entities (NEs) as the only answer type, ignoring the high diversity of answers in the real world. To tackle this problem, we propose a novel unsupervised method that diversifies answers, named DiverseQA. Specifically, the proposed method is composed of three modules: data construction, data augmentation, and a denoising filter. First, the data construction module extends each extracted named entity into a longer sentence constituent, which serves as the new answer span, to construct a QA dataset with diverse answers. Second, the data augmentation module adopts an answer-type-dependent data augmentation process via adversarial training at the embedding level. Third, the denoising filter module is designed to alleviate the noise in the constructed data. Extensive experiments show that the proposed method outperforms previous unsupervised models on five benchmark datasets, including SQuADv1.1, NewsQA, TriviaQA, BioASQ, and DuoRC. Moreover, the proposed method shows strong performance in the few-shot learning setting.
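
The sketch below illustrates the answer-diversifying idea described in the abstract: rather than keeping only the extracted named entity as the answer, the span is extended to a longer syntactic unit. The paper extends entities to sentence constituents; as a rough stand-in, this sketch extends each entity to the dependency subtree of its syntactic head using spaCy. All function names (extend_answer_span, make_qa_examples) are hypothetical and are not taken from the released DiverseQA code.

    import spacy

    # Small English pipeline with NER and a dependency parser.
    nlp = spacy.load("en_core_web_sm")

    def extend_answer_span(doc, ent):
        """Extend a named-entity span to the subtree of its syntactic head,
        clipped to the entity's sentence (a stand-in for constituent extension)."""
        head = ent.root.head
        left, right = head.left_edge.i, head.right_edge.i
        sent = ent.sent
        left = max(left, sent.start)
        right = min(right, sent.end - 1)
        return doc[left:right + 1]

    def make_qa_examples(text):
        """Build (context, answer) records with diversified answer spans."""
        doc = nlp(text)
        examples = []
        for ent in doc.ents:
            answer = extend_answer_span(doc, ent)
            examples.append({
                "context": text,
                "answer_text": answer.text,
                "answer_start": answer.start_char,
                # The entity label could drive an answer-type-dependent
                # augmentation step, as the abstract describes.
                "entity_type": ent.label_,
            })
        return examples

    if __name__ == "__main__":
        passage = ("Marie Curie received the Nobel Prize in Physics in 1903 "
                   "for her research on radiation phenomena.")
        for ex in make_qa_examples(passage):
            print(ex["entity_type"], "->", ex["answer_text"])

Depending on the parse, the extended span can range from the bare entity up to the full sentence; the question-generation and denoising-filter steps from the paper are omitted here.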
Anthology ID: 2022.coling-1.149
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 1732–1742
URL: https://aclanthology.org/2022.coling-1.149
Cite (ACL):
Yuxiang Nie, Heyan Huang, Zewen Chi, and Xian-Ling Mao. 2022. Unsupervised Question Answering via Answer Diversifying. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1732–1742, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Unsupervised Question Answering via Answer Diversifying (Nie et al., COLING 2022)
PDF: https://preview.aclanthology.org/auto-file-uploads/2022.coling-1.149.pdf
Code: jerrrynie/diverseqa
Data: DuoRC, MRQA, Natural Questions, NewsQA, SQuAD, TriviaQA