Towards Imperceptible Document Manipulations against Neural Ranking Models

Xuanang Chen, Ben He, Zheng Ye, Le Sun, Yingfei Sun


Abstract
Adversarial attacks have gained traction as a means of identifying vulnerabilities in neural ranking models (NRMs), but current attack methods often introduce noticeable errors. Moreover, current methods rely heavily on a well-imitated surrogate NRM to guarantee the attack's effectiveness, making them difficult to use in practice. This paper proposes a framework called Imperceptible DocumEnt Manipulation (IDEM) to produce adversarial documents that are less noticeable to both algorithms and humans. IDEM instructs a well-established generative language model, such as BART, to generate error-free connection sentences, and employs a separate position-wise merging strategy to balance the relevance and coherence of the perturbed text. Evaluation results on the MS MARCO benchmark demonstrate that IDEM outperforms strong baselines while preserving the fluency and correctness of the target documents. Furthermore, separating adversarial text generation from the surrogate NRM makes IDEM more robust and less affected by the quality of the surrogate NRM.
Anthology ID:
2023.findings-acl.416
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6648–6664
URL:
https://aclanthology.org/2023.findings-acl.416
DOI:
10.18653/v1/2023.findings-acl.416
Cite (ACL):
Xuanang Chen, Ben He, Zheng Ye, Le Sun, and Yingfei Sun. 2023. Towards Imperceptible Document Manipulations against Neural Ranking Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6648–6664, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Towards Imperceptible Document Manipulations against Neural Ranking Models (Chen et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.findings-acl.416.pdf