Abstract
We propose a novel approach for machine-generated text detection using a RoBERTa model with weighted layer averaging and AdaLoRA for parameter-efficient fine-tuning. Our method incorporates information from all model layers, capturing diverse linguistic cues beyond those accessible from the final layer alone. To mitigate potential overfitting and improve generalizability, we leverage AdaLoRA, which injects trainable low-rank matrices into each Transformer layer, significantly reducing the number of trainable parameters. Furthermore, we employ data mixing to ensure our model encounters text from various domains and generators during training, enhancing its ability to generalize to unseen data. This work highlights the potential of combining layer-wise information with parameter-efficient fine-tuning and data mixing for effective machine-generated text detection.
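As a rough illustration of the architecture described above, the sketch below averages all RoBERTa hidden states with learnable, softmax-normalized layer weights and wraps the encoder with AdaLoRA via the `peft` library. This is a minimal sketch, not the authors' code: the model name, AdaLoRA ranks, schedule values, and classifier head are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's implementation): weighted layer
# averaging over RoBERTa hidden states + AdaLoRA adapters via the peft library.
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizerFast
from peft import AdaLoraConfig, get_peft_model


class WeightedLayerRoberta(nn.Module):
    def __init__(self, model_name: str = "roberta-base", num_labels: int = 2):
        super().__init__()
        backbone = RobertaModel.from_pretrained(model_name, output_hidden_states=True)
        # Inject trainable low-rank (AdaLoRA) matrices into the attention
        # projections; ranks and total_step here are illustrative assumptions.
        adalora_cfg = AdaLoraConfig(
            init_r=12, target_r=4, lora_alpha=32, lora_dropout=0.1,
            target_modules=["query", "value"], total_step=1000,
        )
        self.backbone = get_peft_model(backbone, adalora_cfg)
        # One learnable scalar weight per layer (embeddings + all encoder layers).
        n_layers = backbone.config.num_hidden_layers + 1
        self.layer_weights = nn.Parameter(torch.zeros(n_layers))
        self.classifier = nn.Linear(backbone.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.backbone(input_ids=input_ids, attention_mask=attention_mask)
        hs = torch.stack(out.hidden_states, dim=0)        # [L, B, T, H]
        w = torch.softmax(self.layer_weights, dim=0)      # normalized layer weights
        mixed = (w[:, None, None, None] * hs).sum(dim=0)  # weighted average: [B, T, H]
        return self.classifier(mixed[:, 0])               # classify on <s> token


if __name__ == "__main__":
    tok = RobertaTokenizerFast.from_pretrained("roberta-base")
    model = WeightedLayerRoberta()
    batch = tok(["Example text to score."], return_tensors="pt")
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([1, 2])
```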
- Anthology ID:
- 2024.semeval-1.230
- Volume:
- Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024)
- Month:
- June
- Year:
- 2024
- Address:
- Mexico City, Mexico
- Editors:
- Atul Kr. Ojha, A. Seza Doğruöz, Harish Tayyar Madabushi, Giovanni Da San Martino, Sara Rosenthal, Aiala Rosá
- Venue:
- SemEval
- SIG:
- SIGLEX
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1623–1626
- URL:
- https://preview.aclanthology.org/add_missing_videos/2024.semeval-1.230/
- DOI:
- 10.18653/v1/2024.semeval-1.230
- Cite (ACL):
- Ayan Datta, Aryan Chandramania, and Radhika Mamidi. 2024. Weighted Layer Averaging RoBERTa for Black-Box Machine-Generated Text Detection. In Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024), pages 1623–1626, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal):
- Weighted Layer Averaging RoBERTa for Black-Box Machine-Generated Text Detection (Datta et al., SemEval 2024)
- PDF:
- https://preview.aclanthology.org/add_missing_videos/2024.semeval-1.230.pdf