Generalized Entropy Regularization or: There’s Nothing Special about Label Smoothing

Clara Meister, Elizabeth Salesky, Ryan Cotterell


Abstract
Prior work has explored directly regularizing the output distributions of probabilistic models to alleviate peaky (i.e. over-confident) predictions, a common sign of overfitting. This class of techniques, of which label smoothing is one, has a connection to entropy regularization. Despite the consistent success of label smoothing across architectures and data sets in language generation tasks, two problems remain open: (1) there is little understanding of the underlying effects entropy regularizers have on models, and (2) the full space of entropy regularization techniques is largely unexplored. We introduce a parametric family of entropy regularizers, which includes label smoothing as a special case, and use it to gain a better understanding of the relationship between the entropy of a model and its performance on language generation tasks. We also find that variance in model performance can be explained largely by the resulting entropy of the model. Lastly, we show that label smoothing provably does not allow for sparse output distributions; because the ability to assign zero probability is desirable in language generation models, we advise the use of other entropy regularization methods in its place.
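As a rough illustration of the connection the abstract describes (a sketch, not code from the paper), the snippet below implements label smoothing as cross-entropy against a smoothed target distribution, alongside a generic entropy-bonus regularizer. The function names, the choice of NumPy, and the `beta`/`eps` parameters are illustrative assumptions; the paper's actual parametric family is more general.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax.
    e = np.exp(z - z.max())
    return e / e.sum()

def log_softmax(z):
    # log p = z - logsumexp(z), computed stably.
    return z - (z.max() + np.log(np.sum(np.exp(z - z.max()))))

def label_smoothing_loss(logits, target, eps=0.1):
    # Cross-entropy against a smoothed target: mass (1 - eps) on the
    # gold label, eps spread uniformly over the vocabulary. Note the
    # target distribution is never sparse for eps > 0.
    vocab = len(logits)
    q = np.full(vocab, eps / vocab)
    q[target] += 1.0 - eps
    return -np.dot(q, log_softmax(logits))

def entropy_bonus_loss(logits, target, beta=0.1):
    # Standard cross-entropy minus beta times the model's entropy:
    # one simple member of the broader entropy-regularization family,
    # rewarding less peaky output distributions.
    log_p = log_softmax(logits)
    p = np.exp(log_p)
    cross_entropy = -log_p[target]
    entropy = -np.dot(p, log_p)
    return cross_entropy - beta * entropy
```

With `eps = 0` (or `beta = 0`) both losses reduce to ordinary cross-entropy; label smoothing is algebraically `(1 - eps) * CE + eps * mean(-log p)`, i.e. it adds a uniform-distribution penalty rather than an entropy bonus.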
Anthology ID:
2020.acl-main.615
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6870–6886
URL:
https://aclanthology.org/2020.acl-main.615
DOI:
10.18653/v1/2020.acl-main.615
Cite (ACL):
Clara Meister, Elizabeth Salesky, and Ryan Cotterell. 2020. Generalized Entropy Regularization or: There’s Nothing Special about Label Smoothing. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6870–6886, Online. Association for Computational Linguistics.
Cite (Informal):
Generalized Entropy Regularization or: There’s Nothing Special about Label Smoothing (Meister et al., ACL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2020.acl-main.615.pdf
Video:
http://slideslive.com/38928899
Data
WMT 2014