Discursive Circuits: How Do Language Models Understand Discourse Relations?

Yisong Miao, Min-Yen Kan


Abstract
Which components in transformer language models are responsible for discourse understanding? We hypothesize that sparse computational graphs, termed discursive circuits, control how models process discourse relations. Unlike simpler tasks, discourse relations involve longer spans and complex reasoning. To make circuit discovery feasible, we introduce a task called Completion under Discourse Relation (CuDR), where a model completes a discourse given a specified relation. To support this task, we construct a corpus of minimal contrastive pairs tailored for activation patching in circuit discovery. Experiments show that sparse circuits (≈0.2% of a full GPT-2 model) recover discourse understanding in the English PDTB-based CuDR task. These circuits generalize well to unseen discourse frameworks such as RST and SDRT. Further analysis shows lower layers capture linguistic features such as lexical semantics and coreference, while upper layers encode discourse-level abstractions. Feature utility is consistent across frameworks (e.g., coreference supports Expansion-like relations).
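
To illustrate the kind of activation patching over minimal contrastive pairs that circuit discovery relies on, here is a rough sketch only, not the authors' implementation: it assumes the TransformerLens library and GPT-2, and the prompts, layer, and head indices are hypothetical placeholders for CuDR-style clean/corrupted inputs.

```python
# Minimal activation-patching sketch (illustrative; not the paper's code).
# Assumes the TransformerLens library; prompts stand in for a CuDR-style
# minimal contrastive pair and must tokenize to the same length.
from transformer_lens import HookedTransformer, utils

model = HookedTransformer.from_pretrained("gpt2")

clean = "Arg1 because Arg2"     # hypothetical clean prompt (target relation)
corrupt = "Arg1 although Arg2"  # hypothetical minimally contrastive counterpart

clean_tokens = model.to_tokens(clean)
corrupt_tokens = model.to_tokens(corrupt)

# Cache all activations from the clean run.
_, clean_cache = model.run_with_cache(clean_tokens)

def patch_head_z(z, hook, head_index):
    """Overwrite one attention head's output on the corrupted run with its clean value."""
    z[:, :, head_index, :] = clean_cache[hook.name][:, :, head_index, :]
    return z

# Patch a single (layer, head) pair and inspect how the prediction changes;
# the indices here are arbitrary examples, not circuit components from the paper.
layer, head = 5, 3
hook_name = utils.get_act_name("z", layer)
patched_logits = model.run_with_hooks(
    corrupt_tokens,
    fwd_hooks=[(hook_name, lambda z, hook: patch_head_z(z, hook, head))],
)
print(model.tokenizer.decode(patched_logits[0, -1].argmax().item()))
```

Repeating this patch over candidate components and scoring the recovered behavior is the standard way a sparse circuit is isolated from the full model.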
Anthology ID: 2025.emnlp-main.1657
Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2025
Address: Suzhou, China
Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 32558–32577
URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1657/
Cite (ACL): Yisong Miao and Min-Yen Kan. 2025. Discursive Circuits: How Do Language Models Understand Discourse Relations?. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 32558–32577, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): Discursive Circuits: How Do Language Models Understand Discourse Relations? (Miao & Kan, EMNLP 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1657.pdf
Checklist: 2025.emnlp-main.1657.checklist.pdf