Task Transfer and Domain Adaptation for Zero-Shot Question Answering

Xiang Pan, Alex Sheng, David Shimshoni, Aditya Singhal, Sara Rosenthal, Avirup Sil

Abstract
Pretrained language models have shown success in various areas of natural language processing, including reading comprehension tasks. However, when applying machine learning methods to new domains, labeled data may not always be available. To address this, we use supervised pretraining on source-domain data to reduce sample complexity on domain-specific downstream tasks. We evaluate zero-shot performance on domain-specific reading comprehension tasks by combining task transfer with domain adaptation to fine-tune a pretrained model with no labeled data from the target task. Our approach outperforms Domain-Adaptive Pretraining on downstream domain-specific reading comprehension tasks in 3 out of 4 domains.
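
The sketch below illustrates the recipe the abstract describes: task transfer from a source-domain QA dataset (e.g., SQuAD), optionally preceded by domain-adaptive masked-LM pretraining (the DAPT baseline the paper compares against), then zero-shot evaluation on target-domain text. It is a minimal sketch assuming Hugging Face Transformers and Datasets; the checkpoint names, hyperparameters, and toy examples are illustrative assumptions, not the authors' released configuration (see the Code link below for that).

# A minimal sketch, not the authors' code: supervised task transfer on a
# source-domain QA dataset, evaluated zero-shot on target-domain text.
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
    pipeline,
)
from datasets import Dataset

# Optional domain adaptation (the DAPT baseline): continue masked-LM
# pretraining on unlabeled target-domain text before any QA fine-tuning.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
mlm_model = AutoModelForMaskedLM.from_pretrained("roberta-base")
corpus = Dataset.from_dict({"text": [
    "Unlabeled target-domain sentence one.",  # stand-in for in-domain text
    "Unlabeled target-domain sentence two.",
]})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True),
    batched=True,
    remove_columns=["text"],
)
Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="dapt-ckpt", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer,
                                                  mlm_probability=0.15),
).train()

# Task transfer: a reader fine-tuned on source-domain QA (SQuAD). A public
# SQuAD-tuned checkpoint stands in for that fine-tuning stage here.
reader = pipeline("question-answering", model="deepset/roberta-base-squad2")

# Zero-shot evaluation: no labeled QA pairs from the target domain are used.
pred = reader(
    question="How much was allocated to expand bus service?",
    context="The city council approved the transit budget on Monday, "
            "allocating $4 million to expand bus service.",
)
print(pred["answer"], round(pred["score"], 3))

In the paper's setting, the off-the-shelf SQuAD checkpoint would be replaced by fine-tuning the (optionally domain-adapted) model on the source QA dataset itself, and the unlabeled corpus would be real target-domain text rather than two toy sentences.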
Anthology ID: 2022.deeplo-1.12
Volume: Proceedings of the Third Workshop on Deep Learning for Low-Resource Natural Language Processing
Month: July
Year: 2022
Address: Hybrid
Venue: DeepLo
Publisher: Association for Computational Linguistics
Pages: 110–116
URL: https://aclanthology.org/2022.deeplo-1.12
DOI: 10.18653/v1/2022.deeplo-1.12
Cite (ACL): Xiang Pan, Alex Sheng, David Shimshoni, Aditya Singhal, Sara Rosenthal, and Avirup Sil. 2022. Task Transfer and Domain Adaptation for Zero-Shot Question Answering. In Proceedings of the Third Workshop on Deep Learning for Low-Resource Natural Language Processing, pages 110–116, Hybrid. Association for Computational Linguistics.
Cite (Informal): Task Transfer and Domain Adaptation for Zero-Shot Question Answering (Pan et al., DeepLo 2022)
PDF: https://preview.aclanthology.org/auto-file-uploads/2022.deeplo-1.12.pdf
Video: https://preview.aclanthology.org/auto-file-uploads/2022.deeplo-1.12.mp4
Code: adityaarunsinghal/Domain-Adaptation
Data: NewsQA, SQuAD