Detecting Sockpuppetry on Wikipedia Using Meta-Learning

Luc Raszewski, Christine de Kock


Abstract
Malicious sockpuppet detection on Wikipedia is critical to preserving access to reliable information on the internet and preventing the spread of disinformation. Prior machine learning approaches rely on stylistic and metadata features, but do not prioritise adaptability to author-specific behaviours. As a result, they struggle to effectively model the behaviour of specific sockpuppet groups, especially when text data is limited. To address this, we propose the application of meta-learning, a machine learning technique designed to improve performance in data-scarce settings by training models across multiple tasks. Meta-learning optimises a model for rapid adaptation to the writing style of a new sockpuppet group. Our results show that meta-learning significantly enhances the precision of predictions compared to pre-trained models, marking an advancement in combating sockpuppetry on open editing platforms. We release an updated dataset of sockpuppet investigations to foster future research in both sockpuppetry and meta-learning fields.
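To make the "training across multiple tasks, then rapid adaptation" idea concrete, below is a minimal first-order MAML-style sketch in PyTorch. This is an illustration only, not the paper's implementation: the classifier, the use of fixed-size text embeddings, the episode construction (one task per sockpuppet group, split into support and query sets), and all hyperparameters are assumptions.

import copy
import torch
import torch.nn as nn

class StyleClassifier(nn.Module):
    """Toy classifier over fixed-size text embeddings (e.g. from a frozen encoder); assumed architecture."""
    def __init__(self, dim=768):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 2))

    def forward(self, x):
        return self.net(x)

def fomaml_step(model, meta_opt, tasks, inner_lr=1e-2, inner_steps=3):
    """One meta-update over a batch of tasks (first-order MAML approximation).

    Each task is (support_x, support_y, query_x, query_y) drawn from one
    sockpuppet group: the model adapts on the support set, and the adapted
    model's loss on the query set drives the update to the shared initialisation.
    """
    loss_fn = nn.CrossEntropyLoss()
    meta_opt.zero_grad()
    for support_x, support_y, query_x, query_y in tasks:
        # Inner loop: adapt a copy of the model to this group's writing style.
        fast = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            inner_opt.zero_grad()
            loss_fn(fast(support_x), support_y).backward()
            inner_opt.step()
        # Outer loop: evaluate the adapted weights on held-out query examples and
        # accumulate first-order gradients into the shared initialisation.
        query_loss = loss_fn(fast(query_x), query_y)
        grads = torch.autograd.grad(query_loss, list(fast.parameters()))
        for p, g in zip(model.parameters(), grads):
            p.grad = g if p.grad is None else p.grad + g
    meta_opt.step()

# Usage with random stand-in data (two tasks, 8 support / 8 query examples each):
model = StyleClassifier()
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
tasks = [(torch.randn(8, 768), torch.randint(2, (8,)),
          torch.randn(8, 768), torch.randint(2, (8,))) for _ in range(2)]
fomaml_step(model, meta_opt, tasks)

After meta-training, adapting to a previously unseen sockpuppet group amounts to running only the inner loop on that group's few labelled edits.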
Anthology ID: 2025.acl-long.1083
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 22252–22264
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1083/
Cite (ACL): Luc Raszewski and Christine de Kock. 2025. Detecting Sockpuppetry on Wikipedia Using Meta-Learning. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 22252–22264, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Detecting Sockpuppetry on Wikipedia Using Meta-Learning (Raszewski & de Kock, ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1083.pdf