Abstract
Thanks to state-of-the-art Large Language Models (LLMs), language generation has reached outstanding levels. These models generate high-quality content, making it challenging to distinguish machine-generated text from human-written text. Despite the advantages of Natural Language Generation, the inability to identify automatically generated text raises ethical concerns about authenticity. Consequently, it is important to design and develop methodologies to detect artificial content. In this work, we present classification models constructed by ensembling transformer models such as SciBERT, DeBERTa, and XLNet with Convolutional Neural Networks (CNNs). Our experiments demonstrate that these ensemble architectures surpass the performance of the individual transformer models for classification. Furthermore, the proposed SciBERT-CNN ensemble model achieves an F1-score of 98.36% on the ALTA 2023 shared task data.
- Anthology ID:
- 2023.alta-1.11
- Volume:
- Proceedings of the 21st Annual Workshop of the Australasian Language Technology Association
- Month:
- November
- Year:
- 2023
- Address:
- Melbourne, Australia
- Editors:
- Smaranda Muresan, Vivian Chen, Casey Kennington, David Vandyke, Nina Dethlefs, Koji Inoue, Erik Ekstedt, Stefan Ultes
- Venue:
- ALTA
- Publisher:
- Association for Computational Linguistics
- Pages:
- 107–111
- URL:
- https://aclanthology.org/2023.alta-1.11
- Cite (ACL):
- Vijini Liyanage and Davide Buscaldi. 2023. An Ensemble Method Based on the Combination of Transformers with Convolutional Neural Networks to Detect Artificially Generated Text. In Proceedings of the 21st Annual Workshop of the Australasian Language Technology Association, pages 107–111, Melbourne, Australia. Association for Computational Linguistics.
- Cite (Informal):
- An Ensemble Method Based on the Combination of Transformers with Convolutional Neural Networks to Detect Artificially Generated Text (Liyanage & Buscaldi, ALTA 2023)
- PDF:
- https://aclanthology.org/2023.alta-1.11.pdf
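
The abstract describes pairing a transformer encoder with a CNN classification head. As an illustration only, below is a minimal PyTorch sketch of one such transformer-CNN classifier: the checkpoint name is the standard SciBERT release, but the filter sizes, pooling strategy, and classifier head are assumptions for the sketch, not the authors' exact configuration.

```python
# Hypothetical sketch of a transformer+CNN text classifier in the spirit of
# the paper's SciBERT-CNN ensemble; hyperparameters are illustrative only.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TransformerCNNClassifier(nn.Module):
    def __init__(self, model_name="allenai/scibert_scivocab_uncased",
                 num_filters=128, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # 1D convolutions slide over the sequence of contextual token embeddings
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, hidden) token embeddings from the transformer
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        x = hidden_states.transpose(1, 2)  # (batch, hidden, seq_len) for Conv1d
        # Convolve, apply ReLU, then max-pool over time for each filter width
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))  # (batch, num_classes)

tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = TransformerCNNClassifier()
batch = tokenizer(["An example passage to classify."],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])  # human vs. generated scores
```

Max-pooling over time lets filters of different widths act as n-gram feature detectors on top of the contextual embeddings, which is the usual motivation for stacking a CNN on a transformer's token outputs.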