Multi-source Meta Transfer for Low Resource Multiple-Choice Question Answering

Ming Yan, Hao Zhang, Di Jin, Joey Tianyi Zhou


Abstract
Multiple-choice question answering (MCQA) is one of the most challenging tasks in machine reading comprehension, since it requires advanced reading comprehension skills such as logical reasoning, summarization, and arithmetic operations. Unfortunately, most existing MCQA datasets are small, which increases the difficulty of model learning and generalization. To address this challenge, we propose a multi-source meta transfer (MMT) framework for low-resource MCQA. In this framework, we first extend meta learning by incorporating multiple training sources to learn a generalized feature representation across domains. To bridge the distribution gap between the training sources and the target, we further introduce a meta transfer step that can be integrated into the multi-source meta training. More importantly, the proposed MMT is independent of the backbone language model. Extensive experiments demonstrate the superiority of MMT over state-of-the-art methods, with consistent improvements across different backbone networks in both supervised and unsupervised domain adaptation settings.
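To make the multi-source meta training described in the abstract concrete, here is a minimal first-order MAML-style sketch in PyTorch. It is an illustration under stated assumptions, not the authors' implementation: the `clone_model` helper, the example source datasets, and all hyperparameters are hypothetical, and the paper's meta transfer step for bridging source and target distributions is omitted.

```python
# Minimal first-order MAML-style sketch of multi-source meta training.
# Hypothetical helpers and hyperparameters; not the authors' released code.
import copy
import random
import torch
import torch.nn.functional as F

def clone_model(model):
    """Deep-copy the model so inner-loop updates leave the shared init untouched."""
    return copy.deepcopy(model)

def meta_train_step(model, source_loaders, meta_optimizer,
                    inner_lr=1e-3, inner_steps=1, tasks_per_batch=4):
    """One outer-loop update aggregated over tasks sampled from multiple
    source MCQA datasets (e.g. loaders for RACE, DREAM, MCTest)."""
    meta_optimizer.zero_grad()
    for _ in range(tasks_per_batch):
        loader = random.choice(source_loaders)   # sample a source domain
        fast_model = clone_model(model)
        inner_opt = torch.optim.SGD(fast_model.parameters(), lr=inner_lr)
        # Inner loop: adapt the cloned model on a support batch.
        for _ in range(inner_steps):
            inputs, labels = next(iter(loader))
            loss = F.cross_entropy(fast_model(inputs), labels)
            inner_opt.zero_grad()
            loss.backward()
            inner_opt.step()
        # Outer loop: evaluate the adapted model on a query batch and
        # accumulate first-order gradients into the shared initialization.
        inputs, labels = next(iter(loader))
        query_loss = F.cross_entropy(fast_model(inputs), labels)
        grads = torch.autograd.grad(query_loss, fast_model.parameters())
        for p, g in zip(model.parameters(), grads):
            p.grad = g if p.grad is None else p.grad + g
    meta_optimizer.step()
```

In this sketch, sampling tasks from several source loaders is what makes the meta-learned initialization multi-source; a full reproduction of MMT would additionally interleave transfer updates toward the target domain before fine-tuning.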
Anthology ID:
2020.acl-main.654
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7331–7341
URL:
https://aclanthology.org/2020.acl-main.654
DOI:
10.18653/v1/2020.acl-main.654
Cite (ACL):
Ming Yan, Hao Zhang, Di Jin, and Joey Tianyi Zhou. 2020. Multi-source Meta Transfer for Low Resource Multiple-Choice Question Answering. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7331–7341, Online. Association for Computational Linguistics.
Cite (Informal):
Multi-source Meta Transfer for Low Resource Multiple-Choice Question Answering (Yan et al., ACL 2020)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2020.acl-main.654.pdf
Video:
http://slideslive.com/38929127