Text AutoAugment: Learning Compositional Augmentation Policy for Text Classification

Shuhuai Ren, Jinchao Zhang, Lei Li, Xu Sun, Jie Zhou


Abstract
Data augmentation aims to enrich training samples for alleviating the overfitting issue in low-resource or class-imbalanced situations. Traditional methods first devise task-specific operations such as Synonym Substitution, then manually preset the corresponding parameters, such as the substitution rate; this requires substantial prior knowledge and is prone to yielding sub-optimal results. Moreover, previous methods employ only a limited number of editing operations, which reduces the diversity of the augmented data and thus restricts the performance gain. To overcome these limitations, we propose a framework named Text AutoAugment (TAA) that establishes a compositional and learnable paradigm for data augmentation. We regard a combination of various operations as an augmentation policy and utilize an efficient Bayesian Optimization algorithm to automatically search for the best policy, which substantially improves the generalization capability of models. Experiments on six benchmark datasets show that TAA boosts classification accuracy in low-resource and class-imbalanced regimes by an average of 8.8% and 9.7%, respectively, outperforming strong baselines.
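The abstract describes a policy as a composition of editing operations, each with its own apply probability and magnitude, searched automatically for the best downstream performance. Below is a minimal, self-contained Python sketch of that idea; the operation names, the toy synonym table, the policy encoding, and the random-search loop (standing in for the paper's Bayesian Optimization) are illustrative assumptions, not the authors' implementation. See the lancopku/text-autoaugment repository for the official code.

# Sketch of a compositional augmentation policy and a policy search loop.
# All names and the toy data here are hypothetical, for illustration only.
import random

TOY_SYNONYMS = {"good": ["great", "fine"], "movie": ["film", "picture"],
                "bad": ["poor", "awful"]}

def synonym_substitute(words, rate, rng):
    # Replace a fraction `rate` of eligible words with a synonym.
    return [rng.choice(TOY_SYNONYMS[w]) if w in TOY_SYNONYMS and rng.random() < rate else w
            for w in words]

def random_delete(words, rate, rng):
    # Drop each word with probability `rate` (keep at least one word).
    kept = [w for w in words if rng.random() >= rate]
    return kept or [rng.choice(words)]

def random_swap(words, rate, rng):
    # Swap roughly `rate * len(words)` random word pairs.
    words = list(words)
    for _ in range(max(1, int(rate * len(words)))):
        i, j = rng.randrange(len(words)), rng.randrange(len(words))
        words[i], words[j] = words[j], words[i]
    return words

OPERATIONS = {"synonym_substitute": synonym_substitute,
              "random_delete": random_delete,
              "random_swap": random_swap}

def apply_policy(text, policy, rng):
    # A policy is a composition of (operation, apply-probability, magnitude) triples.
    words = text.split()
    for op_name, prob, magnitude in policy:
        if rng.random() < prob:
            words = OPERATIONS[op_name](words, magnitude, rng)
    return " ".join(words)

def sample_policy(rng, n_ops=2):
    return [(rng.choice(list(OPERATIONS)), rng.uniform(0.2, 0.8), rng.uniform(0.1, 0.4))
            for _ in range(n_ops)]

def search_policy(evaluate, n_trials=20, seed=0):
    # Stand-in for the paper's Bayesian Optimization: random search over policies,
    # scored by `evaluate` (e.g., validation accuracy of a classifier trained on
    # data augmented with the candidate policy).
    rng = random.Random(seed)
    best_policy, best_score = None, float("-inf")
    for _ in range(n_trials):
        policy = sample_policy(rng)
        score = evaluate(policy)
        if score > best_score:
            best_policy, best_score = policy, score
    return best_policy, best_score

if __name__ == "__main__":
    rng = random.Random(0)
    policy = sample_policy(rng)
    print(apply_policy("a good movie with a bad ending", policy, rng))

In this sketch the search objective would be supplied by the user, e.g. a function that retrains a small classifier on the augmented training set and returns validation accuracy, mirroring how the paper scores candidate policies.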
Anthology ID:
2021.emnlp-main.711
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9029–9043
URL:
https://aclanthology.org/2021.emnlp-main.711
DOI:
10.18653/v1/2021.emnlp-main.711
Cite (ACL):
Shuhuai Ren, Jinchao Zhang, Lei Li, Xu Sun, and Jie Zhou. 2021. Text AutoAugment: Learning Compositional Augmentation Policy for Text Classification. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9029–9043, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Text AutoAugment: Learning Compositional Augmentation Policy for Text Classification (Ren et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.emnlp-main.711.pdf
Video:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.emnlp-main.711.mp4
Code
lancopku/text-autoaugment
Data
IMDb Movie Reviews, SST, SST-2, SST-5