Enhance Robustness of Sequence Labelling with Masked Adversarial Training

Luoxin Chen, Xinyue Liu, Weitong Ruan, Jianhua Lu


Abstract
Adversarial training (AT) has shown strong regularization effects on deep learning algorithms by introducing small input perturbations to improve model robustness. In language tasks, adversarial training brings word-level robustness by adding input noise, which is beneficial for text classification. However, it provides little enhancement of contextual information and is therefore less effective for sequence labelling tasks such as chunking and named entity recognition (NER). To address this limitation, we propose masked adversarial training (MAT), which improves robustness by exploiting contextual information in sequence labelling. MAT masks or replaces some words in the sentence when computing the adversarial loss from perturbed inputs, and consequently enhances model robustness with more context-level information. In our experiments, our method shows significant improvements in the accuracy and robustness of sequence labelling. By further incorporating ELMo embeddings, our model achieves results better than or comparable to the state of the art on the CoNLL 2000 and 2003 benchmarks with far fewer parameters.
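A minimal sketch of the idea described in the abstract (not the authors' released implementation): an FGSM-style perturbation is applied in embedding space, and a random fraction of token embeddings is then masked before the adversarial loss is computed, so the tagger must rely on context to resist the perturbation. The model signature model(embeddings, labels) -> loss, the epsilon scale, and the mask_prob rate are illustrative assumptions, and masking is done by zeroing embeddings rather than substituting a special token.

import torch

def masked_adversarial_loss(model, embeddings, labels, epsilon=1.0, mask_prob=0.15):
    """Adversarial loss on perturbed embeddings with some tokens randomly masked."""
    embeddings = embeddings.detach().requires_grad_(True)
    loss = model(embeddings, labels)                  # forward pass used only for gradient estimation
    grad, = torch.autograd.grad(loss, embeddings)

    # Perturb each token embedding along its (L2-normalised) loss gradient.
    perturbation = epsilon * grad / (grad.norm(dim=-1, keepdim=True) + 1e-12)
    adv_embeddings = embeddings.detach() + perturbation.detach()

    # Randomly zero out a fraction of token embeddings (illustrative masking scheme),
    # forcing the model to use surrounding context when resisting the perturbation.
    batch, seq_len, _ = adv_embeddings.shape
    keep = (torch.rand(batch, seq_len, 1, device=adv_embeddings.device) > mask_prob).float()
    adv_embeddings = adv_embeddings * keep

    return model(adv_embeddings, labels)              # adversarial loss term

In a training loop under these assumptions, the returned term would simply be added to the standard supervised loss before calling backward().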
Anthology ID:
2020.findings-emnlp.28
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
297–302
URL:
https://aclanthology.org/2020.findings-emnlp.28
DOI:
10.18653/v1/2020.findings-emnlp.28
Cite (ACL):
Luoxin Chen, Xinyue Liu, Weitong Ruan, and Jianhua Lu. 2020. Enhance Robustness of Sequence Labelling with Masked Adversarial Training. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 297–302, Online. Association for Computational Linguistics.
Cite (Informal):
Enhance Robustness of Sequence Labelling with Masked Adversarial Training (Chen et al., Findings 2020)
PDF:
https://preview.aclanthology.org/landing_page/2020.findings-emnlp.28.pdf
Video:
 https://slideslive.com/38940127
Data
CoNLL, CoNLL 2003, CoNLL-2000