Towards Unsupervised Language Understanding and Generation by Joint Dual Learning

Shang-Yu Su, Chao-Wei Huang, Yun-Nung Chen


Abstract
In modular dialogue systems, natural language understanding (NLU) and natural language generation (NLG) are two critical components: NLU extracts the semantics from given texts, while NLG constructs corresponding natural language sentences from input semantic representations. However, the dual property between understanding and generation has rarely been explored. Prior work made the first attempt to utilize the duality between NLU and NLG to improve performance via a dual supervised learning framework, but it still trained both components in a supervised manner. Instead, this paper introduces a general learning framework that effectively exploits this duality, providing the flexibility to incorporate both supervised and unsupervised learning algorithms to train language understanding and generation models jointly. The benchmark experiments demonstrate that the proposed approach is capable of boosting the performance of both NLU and NLG. The source code is available at: https://github.com/MiuLab/DuaLUG.
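The duality the abstract refers to can be illustrated with a toy example: NLU maps a sentence to a semantic frame and NLG maps a frame back to a sentence, so each direction should approximately invert the other. The sketch below is purely illustrative and hypothetical (the paper trains neural models jointly, not rule-based functions like these); the slot names, templates, and function names are all assumptions made for the example.

```python
# Hypothetical toy sketch of the NLU/NLG dual property (not the authors' code).
# NLU: text -> semantic frame; NLG: semantic frame -> text.

def nlg(frame):
    """Toy NLG: realize a semantic frame as a natural language sentence."""
    return f"{frame['name']} serves {frame['food']} food."

def nlu(text):
    """Toy NLU: recover the semantic frame from a templated sentence."""
    name, rest = text.rstrip(".").split(" serves ")
    food = rest.rsplit(" food", 1)[0]
    return {"name": name, "food": food}

# Dual (cycle) consistency: each direction inverts the other.
frame = {"name": "Midsummer House", "food": "Italian"}
assert nlu(nlg(frame)) == frame

text = "Midsummer House serves Italian food."
assert nlg(nlu(text)) == text
```

In the actual framework, both mappings are learned models, and such cycle-style signals can be optimized with supervised or unsupervised objectives rather than checked exactly as here.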
Anthology ID:
2020.acl-main.63
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
671–680
URL:
https://aclanthology.org/2020.acl-main.63
DOI:
10.18653/v1/2020.acl-main.63
Cite (ACL):
Shang-Yu Su, Chao-Wei Huang, and Yun-Nung Chen. 2020. Towards Unsupervised Language Understanding and Generation by Joint Dual Learning. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 671–680, Online. Association for Computational Linguistics.
Cite (Informal):
Towards Unsupervised Language Understanding and Generation by Joint Dual Learning (Su et al., ACL 2020)
PDF:
https://preview.aclanthology.org/landing_page/2020.acl-main.63.pdf
Video:
http://slideslive.com/38929439
Code:
MiuLab/DuaLUG