A Side-by-side Comparison of Transformers for Implicit Discourse Relation Classification

Bruce W. Lee, Bongseok Yang, Jason Lee


Abstract
Though discourse parsing can benefit many NLP fields, no broad language model search has been conducted for implicit discourse relation classification. This hinders researchers from fully utilizing publicly available models in discourse analysis. This work is a straightforward comparison of 7 pre-trained language models fine-tuned for discourse relation classification. We use PDTB-3, a popular discourse-relation-annotated dataset. Through our model search, we raise the SOTA to 0.671 accuracy and obtain novel observations. Some run contrary to what has been reported before (Shi and Demberg, 2019b): sentence-level pre-training objectives (NSP, SBO, SOP) generally fail to produce the best-performing model for implicit discourse relation classification. Counterintuitively, similarly sized PLMs with MLM and full attention led to better performance. Our code is publicly released.
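The abstract describes fine-tuning pre-trained language models as argument-pair classifiers over PDTB-3 senses. As a rough illustration of that setup (a minimal sketch using the Hugging Face Transformers API, not the authors' released code; the model name, number of sense labels, and example arguments are assumptions), one such fine-tuning step might look like this:

```python
# Hypothetical sketch: fine-tune a PLM to classify the sense of an implicit
# discourse relation given its two argument spans (Arg1, Arg2).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "roberta-base"   # any of the compared PLMs could be substituted here
NUM_SENSES = 14               # placeholder: depends on the PDTB-3 sense level used

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=NUM_SENSES
)

# The two argument spans are encoded as a sentence pair.
arg1 = "The company reported record earnings this quarter."
arg2 = "Its stock price barely moved."
inputs = tokenizer(arg1, arg2, truncation=True, return_tensors="pt")

# Standard fine-tuning step: cross-entropy loss over the sense labels.
labels = torch.tensor([3])    # dummy gold sense index
outputs = model(**inputs, labels=labels)
outputs.loss.backward()       # an optimizer step would follow in a full training loop

predicted_sense = outputs.logits.argmax(dim=-1)
```

The same script shape applies to each of the compared models; only the checkpoint name (and its tokenizer) changes, which is what makes a side-by-side comparison straightforward.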
Anthology ID:
2023.codi-1.2
Volume:
Proceedings of the 4th Workshop on Computational Approaches to Discourse (CODI 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Michael Strube, Chloe Braud, Christian Hardmeier, Junyi Jessy Li, Sharid Loaiciga, Amir Zeldes
Venue:
CODI
Publisher:
Association for Computational Linguistics
Pages:
16–23
URL:
https://aclanthology.org/2023.codi-1.2
DOI:
10.18653/v1/2023.codi-1.2
Cite (ACL):
Bruce W. Lee, Bongseok Yang, and Jason Lee. 2023. A Side-by-side Comparison of Transformers for Implicit Discourse Relation Classification. In Proceedings of the 4th Workshop on Computational Approaches to Discourse (CODI 2023), pages 16–23, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Side-by-side Comparison of Transformers for Implicit Discourse Relation Classification (Lee et al., CODI 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.codi-1.2.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2023.codi-1.2.mp4