Bidirectional Domain Adaptation Using Weighted Multi-Task Learning

Daniel Dakota, Zeeshan Ali Sayyed, Sandra Kübler


Abstract
Domain adaptation in syntactic parsing is still a significant challenge. We address the issue of data imbalance between the in-domain and out-of-domain treebanks typically used for the problem. We define domain adaptation as a multi-task learning (MTL) problem, which allows us to train two parsers, one for each domain. Our results show that the MTL approach is beneficial for the smaller treebank. For the larger treebank, we need to use loss weighting in order to avoid a decrease in performance below the single-task baseline. In order to determine to what degree the data imbalance between the two domains and the domain differences affect results, we also carry out an experiment with two imbalanced in-domain treebanks and show that loss weighting also improves performance in an in-domain setting. Given loss weighting in MTL, we can improve results for both parsers.
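The loss weighting described in the abstract can be illustrated with a minimal sketch: the shared model produces one loss per treebank (task), and a weighted sum lets the larger treebank's task avoid degradation. The function name and the weight values below are illustrative assumptions, not taken from the paper.

```python
def weighted_mtl_loss(loss_small: float, loss_large: float,
                      w_small: float = 0.3, w_large: float = 0.7) -> float:
    """Combine per-task losses with fixed scalar weights.

    Down-weighting the smaller treebank's loss (hypothetical weights)
    keeps the larger-treebank task from falling below its
    single-task performance during joint training.
    """
    return w_small * loss_small + w_large * loss_large

# Example: per-batch losses of 2.0 (small treebank) and 1.0 (large treebank)
total = weighted_mtl_loss(2.0, 1.0)  # 0.3 * 2.0 + 0.7 * 1.0 = 1.3
```

In practice the combined scalar would be backpropagated through the shared encoder once per batch; the weights are hyperparameters tuned per treebank pair.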
Anthology ID:
2021.iwpt-1.10
Volume:
Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies (IWPT 2021)
Month:
August
Year:
2021
Address:
Online
Editors:
Stephan Oepen, Kenji Sagae, Reut Tsarfaty, Gosse Bouma, Djamé Seddah, Daniel Zeman
Venue:
IWPT
SIG:
SIGPARSE
Publisher:
Association for Computational Linguistics
Pages:
93–105
URL:
https://aclanthology.org/2021.iwpt-1.10
DOI:
10.18653/v1/2021.iwpt-1.10
Bibkey:
Cite (ACL):
Daniel Dakota, Zeeshan Ali Sayyed, and Sandra Kübler. 2021. Bidirectional Domain Adaptation Using Weighted Multi-Task Learning. In Proceedings of the 17th International Conference on Parsing Technologies and the IWPT 2021 Shared Task on Parsing into Enhanced Universal Dependencies (IWPT 2021), pages 93–105, Online. Association for Computational Linguistics.
Cite (Informal):
Bidirectional Domain Adaptation Using Weighted Multi-Task Learning (Dakota et al., IWPT 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2021.iwpt-1.10.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2021.iwpt-1.10.mp4
Code:
zeeshansayyed/multiparser