Abstract
Existing textual adversarial attacks usually rely on the model's gradient or prediction confidence to generate adversarial examples, making them hard to deploy in real-world applications. To this end, we consider a rarely investigated but more rigorous setting, namely the hard-label attack, in which the attacker can only access the prediction label. In particular, we find that the importance of different words can be learned from the changes in the prediction label caused by word substitutions in the adversarial examples. Based on this observation, we propose a novel adversarial attack, termed Text Hard-label attacker (TextHacker). TextHacker randomly perturbs numerous words to craft an adversarial example, then adopts a hybrid local search algorithm that estimates word importance from the attack history to minimize the adversarial perturbation. Extensive evaluations on text classification and textual entailment show that TextHacker significantly outperforms existing hard-label attacks in terms of both attack performance and adversarial example quality.
- Anthology ID:
- 2022.findings-emnlp.44
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2022
- Month:
- December
- Year:
- 2022
- Address:
- Abu Dhabi, United Arab Emirates
- Editors:
- Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 622–637
- URL:
- https://preview.aclanthology.org/icon-24-ingestion/2022.findings-emnlp.44/
- DOI:
- 10.18653/v1/2022.findings-emnlp.44
- Cite (ACL):
- Zhen Yu, Xiaosen Wang, Wanxiang Che, and Kun He. 2022. TextHacker: Learning based Hybrid Local Search Algorithm for Text Hard-label Adversarial Attack. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 622–637, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal):
- TextHacker: Learning based Hybrid Local Search Algorithm for Text Hard-label Adversarial Attack (Yu et al., Findings 2022)
- PDF:
- https://preview.aclanthology.org/icon-24-ingestion/2022.findings-emnlp.44.pdf
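The two stages described in the abstract — randomly substituting many words to flip the prediction label, then a local search that reverts unnecessary substitutions while learning word importance from the label changes in the attack history — can be illustrated with a minimal toy sketch. This is not the paper's actual algorithm; the classifier, synonym table, and scoring scheme below are hypothetical stand-ins, and only the hard-label constraint (the attacker sees labels, never scores) is kept.

```python
import random

def predict(words):
    # Toy hard-label victim model: returns a label only, never a confidence.
    # (Hypothetical stand-in: "positive" iff the word "good" is present.)
    return 1 if "good" in words else 0

# Hypothetical synonym table used for substitutions.
SYNONYMS = {"good": "fine", "movie": "film", "very": "quite"}

def attack(words):
    orig_label = predict(words)
    # Stage 1: perturb many words at once, hoping to flip the label.
    adv = [SYNONYMS.get(w, w) for w in words]
    if predict(adv) == orig_label:
        return None  # initialization failed to find an adversarial example
    # Stage 2: local search to minimize the perturbation. Try reverting each
    # substituted word; the label change (or lack of it) tells us whether that
    # word mattered, which a fuller attack would accumulate across history.
    importance = {}
    substituted = [i for i in range(len(words)) if words[i] != adv[i]]
    random.shuffle(substituted)
    for i in substituted:
        trial = adv[:]
        trial[i] = words[i]
        if predict(trial) != orig_label:
            adv = trial                              # still adversarial: revert
            importance[i] = importance.get(i, 0) - 1  # word was unimportant
        else:
            importance[i] = importance.get(i, 0) + 1  # substitution is essential
    return adv, importance
```

For example, attacking `"very good movie"` reverts the harmless substitutions and keeps only the one that actually flips the label, yielding the smaller perturbation `"very fine movie"`.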