Gated Multi-Task Network for Text Classification

Liqiang Xiao, Honglun Zhang, Wenqing Chen


Abstract
Multi-task learning with Convolutional Neural Networks (CNNs) has shown great success in many Natural Language Processing (NLP) tasks. This success can largely be attributed to feature sharing, achieved by fusing some layers across tasks. However, most existing approaches share features fully or proportionally, without distinguishing how helpful those features are. As a result, the network can be confused by unhelpful or even harmful features, creating undesired interference between tasks. In this paper, we introduce a gate mechanism into multi-task CNNs and propose a new Gated Sharing Unit, which filters the feature flows between tasks and greatly reduces this interference. Experiments on 9 text classification datasets show that our approach learns selection rules automatically and achieves a significant improvement over strong baselines.
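To make the gating idea in the abstract concrete, below is a minimal PyTorch sketch of one plausible reading of a gated sharing unit: an element-wise sigmoid gate that filters features flowing from one task's CNN encoder into another before fusion. The class and parameter names (GatedSharingUnit, gate_fc) and the exact fusion rule are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class GatedSharingUnit(nn.Module):
    """Filters features flowing from a source task into a target task (illustrative sketch)."""
    def __init__(self, feature_dim: int):
        super().__init__()
        # The gate is conditioned on both the incoming (source) features
        # and the target task's own features.
        self.gate_fc = nn.Linear(2 * feature_dim, feature_dim)

    def forward(self, target_feat: torch.Tensor, source_feat: torch.Tensor) -> torch.Tensor:
        # Element-wise gate in [0, 1]: decides how much of each shared
        # feature dimension is allowed to pass between tasks.
        gate = torch.sigmoid(self.gate_fc(torch.cat([target_feat, source_feat], dim=-1)))
        # Fuse the target's own features with the filtered shared features.
        return target_feat + gate * source_feat

# Usage: fuse CNN sentence representations from two classification tasks.
dim = 128
gsu = GatedSharingUnit(dim)
task_a_feat = torch.randn(32, dim)       # features from task A's CNN encoder
task_b_feat = torch.randn(32, dim)       # features from task B's CNN encoder
fused_a = gsu(task_a_feat, task_b_feat)  # task A view with gated sharing from B
```

The design choice here is that the gate sees both tasks' representations, so it can suppress shared features that would interfere with the target task while letting helpful ones through; the paper's exact parameterization may differ.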
Anthology ID:
N18-2114
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
726–731
URL:
https://aclanthology.org/N18-2114
DOI:
10.18653/v1/N18-2114
Cite (ACL):
Liqiang Xiao, Honglun Zhang, and Wenqing Chen. 2018. Gated Multi-Task Network for Text Classification. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 726–731, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Gated Multi-Task Network for Text Classification (Xiao et al., NAACL 2018)
PDF:
https://preview.aclanthology.org/naacl24-info/N18-2114.pdf