A Word is Worth A Thousand Dollars: Adversarial Attack on Tweets Fools Stock Prediction

Yong Xie, Dakuo Wang, Pin-Yu Chen, Jinjun Xiong, Sijia Liu, Oluwasanmi Koyejo


Abstract
More and more investors and machine learning models rely on social media (e.g., Twitter and Reddit) to gather information and predict stock price movements. Although text-based models are known to be vulnerable to adversarial attacks, whether stock prediction models have similar vulnerabilities under the necessary real-world constraints remains underexplored. In this paper, we experiment with a variety of adversarial attack configurations to fool three stock prediction victim models. We address the task of adversarial generation by solving combinatorial optimization problems with semantic and budget constraints. Our results show that the proposed attack method achieves consistent success rates and causes significant monetary loss in a trading simulation by simply concatenating a perturbed but semantically similar tweet.
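To make the attack idea in the abstract concrete, the sketch below illustrates a simplified concatenation attack: enumerate tweets with at most a budget of word swaps, keep only candidates that stay close to the original (here a crude lexical-overlap proxy stands in for the paper's semantic constraint), and append the first candidate that flips the victim model's prediction. All names (victim_model, synonyms, the similarity measure) are illustrative assumptions, not the authors' actual optimization procedure.

```python
# Illustrative sketch of a budget-constrained tweet concatenation attack.
# NOTE: simplified greedy enumeration; the paper formulates this as a
# combinatorial optimization problem with semantic and budget constraints.
from itertools import combinations, product
from typing import Callable, Dict, List, Sequence


def perturb_candidates(tweet: str,
                       synonyms: Dict[str, List[str]],
                       budget: int) -> List[str]:
    """Enumerate tweets with at most `budget` words swapped for listed synonyms."""
    words = tweet.split()
    swappable = [i for i, w in enumerate(words) if w.lower() in synonyms]
    candidates = []
    for k in range(1, budget + 1):
        for positions in combinations(swappable, k):
            choices = [synonyms[words[i].lower()] for i in positions]
            for replacement in product(*choices):
                new_words = list(words)
                for i, rep in zip(positions, replacement):
                    new_words[i] = rep
                candidates.append(" ".join(new_words))
    return candidates


def lexical_similarity(a: str, b: str) -> float:
    """Crude word-overlap proxy for the semantic-similarity constraint."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0


def attack(tweets: Sequence[str],
           victim_model: Callable[[Sequence[str]], int],
           synonyms: Dict[str, List[str]],
           budget: int = 1,
           min_similarity: float = 0.7) -> List[str]:
    """Append one perturbed-but-similar tweet that flips the victim's prediction."""
    original_pred = victim_model(tweets)
    for tweet in tweets:
        for cand in perturb_candidates(tweet, synonyms, budget):
            if lexical_similarity(tweet, cand) < min_similarity:
                continue  # enforce the semantics constraint
            adversarial_input = list(tweets) + [cand]  # concatenation attack
            if victim_model(adversarial_input) != original_pred:
                return adversarial_input
    return list(tweets)  # no successful attack within the budget


if __name__ == "__main__":
    # Toy victim: predicts "up" (1) unless the word "plunge" appears anywhere.
    toy_victim = lambda ts: 0 if any("plunge" in t for t in ts) else 1
    tweets = ["$AAPL expected to climb after strong earnings"]
    synonyms = {"climb": ["plunge", "rise"]}
    print(attack(tweets, toy_victim, synonyms, budget=1))
```

The toy example only shows the interface: a real attack would query one of the three victim stock-prediction models and use an embedding-based similarity measure instead of word overlap.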
Anthology ID:
2022.naacl-main.43
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
587–599
URL:
https://aclanthology.org/2022.naacl-main.43
DOI:
10.18653/v1/2022.naacl-main.43
Cite (ACL):
Yong Xie, Dakuo Wang, Pin-Yu Chen, Jinjun Xiong, Sijia Liu, and Oluwasanmi Koyejo. 2022. A Word is Worth A Thousand Dollars: Adversarial Attack on Tweets Fools Stock Prediction. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 587–599, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
A Word is Worth A Thousand Dollars: Adversarial Attack on Tweets Fools Stock Prediction (Xie et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.naacl-main.43.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2022.naacl-main.43.mp4
Code:
yonxie/advfintweet
Data:
StockNet