Diversifying Dialog Generation via Adaptive Label Smoothing

Yida Wang, Yinhe Zheng, Yong Jiang, Minlie Huang


Abstract
Neural dialogue generation models trained with the one-hot target distribution suffer from the over-confidence issue, which leads to poor generation diversity as widely reported in the literature. Although existing approaches such as label smoothing can alleviate this issue, they fail to adapt to diverse dialog contexts. In this paper, we propose an Adaptive Label Smoothing (AdaLabel) approach that can adaptively estimate a target label distribution at each time step for different contexts. The maximum probability in the predicted distribution is used to modify the soft target distribution produced by a novel light-weight bi-directional decoder module. The resulting target distribution is aware of both previous and future contexts and is adjusted to avoid over-training the dialogue model. Our model can be trained in an end-to-end manner. Extensive experiments on two benchmark datasets show that our approach outperforms various competitive baselines in producing diverse responses.
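The core idea in the abstract can be sketched as follows: instead of a fixed one-hot target, the weight placed on the ground-truth token at each step tracks the model's own maximum predicted probability, and the remaining mass is spread according to an auxiliary distribution (standing in for the paper's light-weight bi-directional decoder). This is a minimal illustrative sketch, assuming a simple lower bound on the ground-truth weight; the function and its details are not the authors' exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_soft_target(pred_logits, aux_logits, target_ids):
    """Build an adaptive soft target distribution (illustrative sketch).

    pred_logits: (batch, vocab) current decoder logits
    aux_logits:  (batch, vocab) logits from an auxiliary module
                 (a stand-in for the paper's bi-directional decoder)
    target_ids:  (batch,) ground-truth token ids
    """
    vocab = pred_logits.shape[-1]
    probs = softmax(pred_logits)

    # Adaptive ground-truth weight: tied to the model's own max confidence,
    # floored at 0.5 so the true token always dominates the target.
    eps = np.maximum(probs.max(axis=-1, keepdims=True), 0.5)

    # Auxiliary distribution over the non-target words only.
    aux = softmax(aux_logits)
    rows = np.arange(len(target_ids))
    aux[rows, target_ids] = 0.0                 # exclude the true token
    aux /= aux.sum(axis=-1, keepdims=True)      # renormalize the rest

    one_hot = np.eye(vocab)[target_ids]
    # Mix: eps on the true token, (1 - eps) spread over other words.
    return eps * one_hot + (1.0 - eps) * aux    # each row sums to 1
```

Training against such a target with cross-entropy softens supervision exactly where the model is already over-confident, which is the mechanism the abstract credits for improved response diversity.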
Anthology ID:
2021.acl-long.272
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
3507–3520
URL:
https://aclanthology.org/2021.acl-long.272
DOI:
10.18653/v1/2021.acl-long.272
Cite (ACL):
Yida Wang, Yinhe Zheng, Yong Jiang, and Minlie Huang. 2021. Diversifying Dialog Generation via Adaptive Label Smoothing. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3507–3520, Online. Association for Computational Linguistics.
Cite (Informal):
Diversifying Dialog Generation via Adaptive Label Smoothing (Wang et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.acl-long.272.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2021.acl-long.272.mp4
Code
lemon234071/AdaLabel
Data
DailyDialog