CLaC at DISRPT 2025: Hierarchical Adapters for Cross-Framework Multi-lingual Discourse Relation Classification

Nawar Turk, Daniele Comitogianni, Leila Kosseim


Abstract
We present our submission to Task 3 (Discourse Relation Classification) of the DISRPT 2025 shared task. Task 3 introduces a unified set of 17 discourse relation labels across 39 corpora in 16 languages and six discourse frameworks, posing significant multilingual and cross-formalism challenges. We first benchmark the task by fine-tuning multilingual BERT-based models (mBERT, XLM-RoBERTa-Base, and XLM-RoBERTa-Large) with two argument-ordering strategies and progressive unfreezing ratios to establish strong baselines. We then evaluate prompt-based large language models (namely Claude Opus 4.0) in zero-shot and few-shot settings to understand how LLMs respond to the newly proposed unified labels. Finally, we introduce HiDAC, a Hierarchical Dual-Adapter Contrastive learning model. Results show that while larger transformer models achieve higher accuracy, the improvements are modest, and that unfreezing the top 75% of encoder layers yields performance comparable to full fine-tuning while training far fewer parameters. Prompt-based models lag significantly behind fine-tuned transformers, and HiDAC achieves the highest overall accuracy (67.5%) while remaining more parameter-efficient than full fine-tuning.
Anthology ID:
2025.disrpt-1.3
Volume:
Proceedings of the 4th Shared Task on Discourse Relation Parsing and Treebanking (DISRPT 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Chloé Braud, Yang Janet Liu, Philippe Muller, Amir Zeldes, Chuyuan Li
Venues:
DISRPT | WS
Publisher:
Association for Computational Linguistics
Pages:
36–47
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.disrpt-1.3/
Cite (ACL):
Nawar Turk, Daniele Comitogianni, and Leila Kosseim. 2025. CLaC at DISRPT 2025: Hierarchical Adapters for Cross-Framework Multi-lingual Discourse Relation Classification. In Proceedings of the 4th Shared Task on Discourse Relation Parsing and Treebanking (DISRPT 2025), pages 36–47, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
CLaC at DISRPT 2025: Hierarchical Adapters for Cross-Framework Multi-lingual Discourse Relation Classification (Turk et al., DISRPT 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.disrpt-1.3.pdf
Supplementary material:
2025.disrpt-1.3.SupplementaryMaterial.zip