Tchebycheff Procedure for Multi-task Text Classification

Yuren Mao, Shuang Yun, Weiwei Liu, Bo Du


Abstract
Multi-task learning methods have achieved great progress in text classification. However, existing methods assume that multi-task text classification problems are convex multi-objective optimization problems, which is unrealistic in real-world applications. To address this issue, this paper presents a novel Tchebycheff procedure to optimize multi-task classification problems without the convexity assumption. Extensive experiments support our theoretical analysis and validate the superiority of our proposal.
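For readers unfamiliar with the scalarization the title refers to: the weighted Tchebycheff approach replaces a weighted sum of task losses with a weighted max over per-task gaps to an ideal point, which can reach Pareto-optimal solutions even for non-convex fronts. The sketch below is a generic illustration of that classical scalarization, not the paper's exact procedure; the function name, weights, and toy loss values are ours.

```python
import numpy as np

def tchebycheff(losses, weights, ideal):
    """Weighted Tchebycheff scalarization: max_i w_i * (L_i - z_i*).

    `losses` are the per-task losses L_i, `ideal` is the ideal point z*
    (a lower bound on each task loss), and `weights` trade the tasks off.
    Minimizing this max-gap objective, rather than a weighted sum, is what
    allows non-convex Pareto fronts to be explored.
    """
    losses = np.asarray(losses, dtype=float)
    return float(np.max(weights * (losses - ideal)))

# Toy example: two task losses, equal weights, ideal point at the origin.
losses = [0.8, 0.3]
weights = np.array([0.5, 0.5])
ideal = np.zeros(2)
print(tchebycheff(losses, weights, ideal))  # 0.4 (the larger weighted gap)
```

In a multi-task training loop one would minimize this scalarized value over the shared parameters, typically updating only the task currently attaining the max.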
Anthology ID:
2020.acl-main.388
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4217–4226
URL:
https://aclanthology.org/2020.acl-main.388
DOI:
10.18653/v1/2020.acl-main.388
Cite (ACL):
Yuren Mao, Shuang Yun, Weiwei Liu, and Bo Du. 2020. Tchebycheff Procedure for Multi-task Text Classification. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4217–4226, Online. Association for Computational Linguistics.
Cite (Informal):
Tchebycheff Procedure for Multi-task Text Classification (Mao et al., ACL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2020.acl-main.388.pdf
Software:
2020.acl-main.388.Software.zip
Video:
http://slideslive.com/38928751