Non-Existent Relationship: Fact-Aware Multi-Level Machine-Generated Text Detection

Yang Wu, Ruijia Wang, Jie Wu


Abstract
Machine-generated text detection is critical for preventing misuse of large language models (LLMs). Although LLMs have recently excelled at mimicking human writing styles, they still suffer from factual hallucinations manifested as entity-relation inconsistencies with real-world knowledge. Current detection methods inadequately address the authenticity of the entity graph, which is a key discriminative feature for identifying machine-generated content. To bridge this gap, we propose a fact-aware model that assesses discrepancies between textual and factual entity graphs through graph comparison. To holistically analyze contextual information, our approach employs hierarchical feature extraction with gating units, enabling the adaptive fusion of multi-grained features from the entity, sentence, and document levels. Experimental results on three public datasets demonstrate that our approach outperforms state-of-the-art methods. Interpretability analysis shows that our model can capture the differences in entity graphs between machine-generated and human-written texts.
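The adaptive fusion described in the abstract can be illustrated with a minimal sketch. The code below assumes one scalar sigmoid gate per granularity level (entity, sentence, document), computed from that level's own features; the weights, biases, and function names are illustrative placeholders, not the paper's actual parameterization.

```python
import math

def sigmoid(x):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def gated_fusion(h_entity, h_sentence, h_document, w, b):
    """Fuse three equal-length feature vectors with scalar gates.

    Each level's gate is a sigmoid over a learned linear score of its
    own features (weights `w[k]`, bias `b[k]` are hypothetical values
    for illustration); the fused representation is the gate-weighted
    elementwise sum of the three level vectors.
    """
    levels = [h_entity, h_sentence, h_document]
    gates = [sigmoid(sum(wi * hi for wi, hi in zip(w[k], h)) + b[k])
             for k, h in enumerate(levels)]
    dim = len(h_entity)
    fused = [sum(gates[k] * levels[k][i] for k in range(3))
             for i in range(dim)]
    return gates, fused

# Toy example with 2-dimensional features and placeholder weights.
h_e, h_s, h_d = [0.2, -0.1], [0.5, 0.3], [-0.4, 0.8]
w = [[0.1, 0.2]] * 3   # one weight row per level (illustrative)
b = [0.0] * 3          # one bias per level (illustrative)
gates, fused = gated_fusion(h_e, h_s, h_d, w, b)
```

In practice the gates let the model down-weight a level (e.g. sparse entity features in a short text) while still passing through the others.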
Anthology ID:
2025.emnlp-main.186
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3757–3768
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.186/
Cite (ACL):
Yang Wu, Ruijia Wang, and Jie Wu. 2025. Non-Existent Relationship: Fact-Aware Multi-Level Machine-Generated Text Detection. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 3757–3768, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Non-Existent Relationship: Fact-Aware Multi-Level Machine-Generated Text Detection (Wu et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.186.pdf
Checklist:
2025.emnlp-main.186.checklist.pdf