多模块联合的阅读理解候选句抽取(Evidence sentence extraction for reading comprehension based on multi-module)

Yu Ji (吉宇), Xiaoyue Wang (王笑月), Ru Li (李茹), Shaoru Guo (郭少茹), Yong Guan (关勇)


Abstract
Machine reading comprehension, as a key task in natural language understanding, has attracted wide attention from researchers at home and abroad. To address the difficulty of evidence sentence extraction in multiple-choice reading comprehension, where no clue annotations are available and multi-step reasoning is required, this paper proposes a multi-module evidence sentence extraction model. First, a pre-trained model is fine-tuned on partially annotated data; second, evidence sentences for multi-hop reasoning questions are extracted recursively via TF-IDF; finally, an unsupervised method is used to further filter the model's predictions and reduce redundancy. The approach is evaluated on Chinese Gaokao (college entrance examination) multiple-choice questions and the RACE dataset. For evidence sentence extraction, the proposed method improves F1 by 3.44% over the best baseline; in the downstream answering task, using the extracted evidence sentences as model input instead of the full passage improves accuracy by 3.68% and 3.6%, respectively. These results confirm the effectiveness of the proposed method.
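The recursive TF-IDF step mentioned in the abstract can be illustrated with a minimal sketch (not the authors' code): at each hop the candidate sentence most similar to the current query is selected, then appended to the query before the next round, so later hops can follow bridging information. The hop count, tokenisation, and toy passage below are illustrative assumptions only.

```python
# Minimal sketch of recursive TF-IDF evidence sentence extraction for
# multi-hop questions, under the assumptions stated above.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def extract_evidence(question: str, sentences: list[str], hops: int = 2) -> list[str]:
    query = question
    remaining = list(sentences)
    evidence = []
    for _ in range(hops):
        if not remaining:
            break
        # Fit TF-IDF on the current query plus all remaining candidate sentences.
        vectorizer = TfidfVectorizer()
        matrix = vectorizer.fit_transform([query] + remaining)
        # Score each candidate by cosine similarity to the query vector (row 0).
        scores = cosine_similarity(matrix[0], matrix[1:]).ravel()
        best = int(scores.argmax())
        evidence.append(remaining.pop(best))
        # Expand the query with the selected sentence to enable the next hop.
        query = query + " " + evidence[-1]
    return evidence


if __name__ == "__main__":
    passage = [
        "The event was held in the capital of France.",
        "Paris is the capital of France.",
        "The winner received a medal.",
    ]
    print(extract_evidence("Where was the event held?", passage))
```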
Anthology ID:
2020.ccl-1.23
Volume:
Proceedings of the 19th Chinese National Conference on Computational Linguistics
Month:
October
Year:
2020
Address:
Haikou, China
Editors:
Maosong Sun (孙茂松), Sujian Li (李素建), Yue Zhang (张岳), Yang Liu (刘洋)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Note:
Pages:
236–245
Language:
Chinese
URL:
https://aclanthology.org/2020.ccl-1.23
Cite (ACL):
Yu Ji, Xiaoyue Wang, Ru Li, Shaoru Guo, and Yong Guan. 2020. 多模块联合的阅读理解候选句抽取(Evidence sentence extraction for reading comprehension based on multi-module). In Proceedings of the 19th Chinese National Conference on Computational Linguistics, pages 236–245, Haikou, China. Chinese Information Processing Society of China.
Cite (Informal):
多模块联合的阅读理解候选句抽取(Evidence sentence extraction for reading comprehension based on multi-module) (Ji et al., CCL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2020.ccl-1.23.pdf
Data
RACE