Unsupervised Candidate Answer Extraction through Differentiable Masker-Reconstructor Model

Zhuoer Wang, Yicheng Wang, Ziwei Zhu, James Caverlee


Abstract
Question generation is a widely used data augmentation approach with extensive applications, and extracting qualified candidate answers from context passages is a critical step in most question generation systems. However, existing candidate answer extraction methods rely on linguistic rules or annotated data, and they suffer from partial annotation and limited generalization. To overcome these limitations, we propose a novel unsupervised candidate answer extraction approach that leverages the inherent structure of context passages through a Differentiable Masker-Reconstructor (DMR) model, which enforces self-consistency to pick out salient information tokens. We curate two datasets with exhaustively annotated answers and benchmark a comprehensive set of supervised and unsupervised candidate answer extraction methods. We demonstrate the effectiveness of the DMR model by showing that its performance is superior among unsupervised methods and comparable to that of supervised methods.
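
The paper itself gives the precise architecture and training objective; purely as an illustration, the sketch below shows one plausible way a differentiable masker-reconstructor could be wired up: a masker scores each token, a Gumbel-softmax relaxation makes the discrete mask decisions differentiable, and a reconstructor is trained to recover the masked tokens so that informative (hard-to-reconstruct) tokens surface as answer candidates. The module sizes, the Gumbel-softmax trick, the target mask rate, and the loss weights here are all assumptions made for the sketch, not the paper's design, and the self-consistency enforcement mentioned in the abstract is omitted.

```python
# Minimal, illustrative masker-reconstructor sketch (PyTorch).
# NOT the architecture or loss from the paper; all hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskerReconstructor(nn.Module):
    def __init__(self, vocab_size=1000, d_model=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Masker: per-token logits over (keep, mask).
        self.masker = nn.Linear(d_model, 2)
        # Reconstructor: predicts the original tokens from the partially masked sequence.
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.reconstructor = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.out = nn.Linear(d_model, vocab_size)
        # Learned embedding substituted at masked positions.
        self.mask_embed = nn.Parameter(torch.zeros(d_model))

    def forward(self, token_ids, tau=1.0):
        x = self.embed(token_ids)                                   # (B, T, D)
        logits = self.masker(x)                                     # (B, T, 2)
        # Straight-through Gumbel-softmax keeps the mask decision differentiable.
        gate = F.gumbel_softmax(logits, tau=tau, hard=True)[..., 1:]  # (B, T, 1); 1 = masked
        masked_x = x * (1 - gate) + self.mask_embed * gate
        h = self.reconstructor(masked_x)
        recon_logits = self.out(h)                                  # (B, T, V)
        return recon_logits, gate.squeeze(-1)


def loss_fn(recon_logits, token_ids, gate, sparsity_weight=0.1):
    # Reconstruction loss on masked positions plus a regularizer that keeps
    # the fraction of masked tokens near an (assumed) target rate.
    ce = F.cross_entropy(recon_logits.transpose(1, 2), token_ids, reduction="none")
    recon = (ce * gate).sum() / gate.sum().clamp(min=1.0)
    sparsity = (gate.mean() - 0.15).abs()
    return recon + sparsity_weight * sparsity
```

In such a setup, tokens the masker consistently chooses to mask (and that the reconstructor struggles to recover) would be read off as candidate answer spans at inference time; how the paper actually extracts spans from the learned mask is described in the full text.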
Anthology ID:
2023.findings-emnlp.379
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5712–5723
URL:
https://aclanthology.org/2023.findings-emnlp.379
DOI:
10.18653/v1/2023.findings-emnlp.379
Cite (ACL):
Zhuoer Wang, Yicheng Wang, Ziwei Zhu, and James Caverlee. 2023. Unsupervised Candidate Answer Extraction through Differentiable Masker-Reconstructor Model. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 5712–5723, Singapore. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Candidate Answer Extraction through Differentiable Masker-Reconstructor Model (Wang et al., Findings 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.findings-emnlp.379.pdf