CCTAA: A Reproducible Corpus for Chinese Authorship Attribution Research

Haining Wang, Allen Riddell


Abstract
Authorship attribution infers the likely author of an unsigned, single-authored document from a pool of candidates. Despite recent advances, the lack of standard, reproducible testbeds for Chinese-language documents impedes progress. In this paper, we present the Chinese Cross-Topic Authorship Attribution (CCTAA) corpus, the first standard testbed for authorship attribution on contemporary Chinese prose. Its cross-topic design and the relatively inflexible genre of newswire contribute to an appropriate level of difficulty, and pre-defined data splits support reproducible research. We show that a sequence classifier based on pre-trained Chinese RoBERTa embeddings and a support vector machine classifier using function character n-gram frequency features both perform below expectations on this task. The code for generating the corpus and reproducing the baselines is freely available at https://codeberg.org/haining/cctaa.
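
For orientation, a minimal sketch of the second kind of baseline mentioned in the abstract: character n-gram frequency features fed to a linear SVM. This is not the authors' released code (see the linked repository for that); it uses scikit-learn, raw character n-gram counts rather than the paper's function-character frequencies, and toy placeholder data, all of which are illustrative assumptions.

# Sketch of a character n-gram + linear SVM authorship-attribution baseline.
# Assumptions: scikit-learn pipeline, 1-3 character n-grams as count features,
# toy documents and author labels. The CCTAA baseline additionally restricts
# features to function characters, which is not shown here.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical toy data: each document is a Chinese prose sample, each label an author.
train_texts = ["这是第一位作者撰写的新闻文本。", "这是第二位作者撰写的新闻文本。"]
train_labels = ["author_a", "author_b"]
test_texts = ["一篇未署名的新闻文本。"]

pipeline = make_pipeline(
    # Character-level n-grams (1 to 3 characters) counted per document.
    CountVectorizer(analyzer="char", ngram_range=(1, 3)),
    # Linear SVM over the n-gram feature vectors.
    LinearSVC(),
)
pipeline.fit(train_texts, train_labels)
print(pipeline.predict(test_texts))

In practice the candidate pool, the feature vocabulary, and the train/test split would come from the pre-defined CCTAA splits rather than the toy lists above.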
Anthology ID:
2022.lrec-1.633
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
5889–5893
URL:
https://aclanthology.org/2022.lrec-1.633
Cite (ACL):
Haining Wang and Allen Riddell. 2022. CCTAA: A Reproducible Corpus for Chinese Authorship Attribution Research. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 5889–5893, Marseille, France. European Language Resources Association.
Cite (Informal):
CCTAA: A Reproducible Corpus for Chinese Authorship Attribution Research (Wang & Riddell, LREC 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2022.lrec-1.633.pdf