Abstract
In this paper, we empirically investigate adversarial attacks on NMT from two aspects: languages (the source vs. the target language) and positions (front vs. rear). For autoregressive NMT models that generate target words from left to right, we observe that attacking the source language is more effective than attacking the target language, and that attacking the front positions of target sentences, or the source positions aligned to those front target positions, is more effective than attacking other positions. We further exploit the attention distribution of the victim model to attack source sentences at positions strongly associated with front target words. Experimental results demonstrate that our attention-based adversarial attack is more effective than attacks that sample positions randomly or according to gradients.
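As a rough illustration of the attention-based position selection described in the abstract, the sketch below scores each source position by the attention mass it receives from the first few target words and attacks the top-scoring positions. This is not the authors' implementation: the attention aggregation (a single cross-attention matrix, e.g. averaged over layers and heads), the cutoff for "front" target words, and all names are assumptions made for illustration.

```python
import numpy as np

def select_attack_positions(cross_attention: np.ndarray,
                            num_front_target: int = 3,
                            num_positions: int = 2) -> list[int]:
    """Pick source positions most associated with front target words.

    cross_attention: array of shape (target_len, source_len), e.g. the
        victim model's encoder-decoder attention averaged over layers
        and heads (an assumption; the paper may aggregate differently).
    num_front_target: how many leading target positions count as "front".
    num_positions: how many source positions to perturb.
    """
    # Attention mass each source position receives from the front target words.
    front = cross_attention[:num_front_target]   # (num_front_target, source_len)
    scores = front.sum(axis=0)                   # (source_len,)
    # Attack the source positions with the strongest association scores.
    return list(np.argsort(scores)[::-1][:num_positions])

# Toy example: 4 target tokens attending over 5 source tokens.
attn = np.array([[0.6, 0.1, 0.1, 0.1, 0.1],
                 [0.1, 0.5, 0.2, 0.1, 0.1],
                 [0.1, 0.1, 0.2, 0.5, 0.1],
                 [0.1, 0.1, 0.1, 0.1, 0.6]])
print(select_attack_positions(attn))  # -> [0, 3]: sources tied to front targets
```

The selected positions would then be perturbed (e.g. by word substitution) to craft the adversarial source sentence; random or gradient-based position sampling, the baselines mentioned above, would replace only the scoring step.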
- Anthology ID: 2021.acl-short.58
- Volume: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
- Month: August
- Year: 2021
- Address: Online
- Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
- Venues: ACL | IJCNLP
- Publisher: Association for Computational Linguistics
- Pages: 454–460
- URL: https://aclanthology.org/2021.acl-short.58
- DOI: 10.18653/v1/2021.acl-short.58
- Cite (ACL): Zhiyuan Zeng and Deyi Xiong. 2021. An Empirical Study on Adversarial Attack on NMT: Languages and Positions Matter. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 454–460, Online. Association for Computational Linguistics.
- Cite (Informal): An Empirical Study on Adversarial Attack on NMT: Languages and Positions Matter (Zeng & Xiong, ACL-IJCNLP 2021)
- PDF: https://preview.aclanthology.org/nschneid-patch-1/2021.acl-short.58.pdf