Encoding and Fusing Semantic Connection and Linguistic Evidence for Implicit Discourse Relation Recognition

Wei Xiang, Bang Wang, Lu Dai, Yijun Mo


Abstract
Prior studies use a single attention mechanism to improve contextual semantic representation learning for implicit discourse relation recognition (IDRR). However, diverse relation senses may benefit from different attention mechanisms. We also argue that the linguistic relations between two words can be further exploited for IDRR. This paper proposes a Multi-Attentive Neural Fusion (MANF) model to encode and fuse both semantic connection and linguistic evidence for IDRR. In MANF, we design a Dual Attention Network (DAN) to learn and fuse two kinds of attentive representations of the arguments as their semantic connection. We also propose an Offset Matrix Network (OMN) to encode the linguistic relations of word pairs as linguistic evidence. Our MANF model achieves state-of-the-art results on the PDTB 3.0 corpus.
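The abstract does not spell out the DAN architecture, but the idea of combining two attentive views of an argument pair can be illustrated with a minimal NumPy sketch. Everything below is an assumption for illustration: the function names, the use of unparameterized scaled dot-product attention, and the concatenate-then-project fusion are not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # scaled dot-product attention of an argument over its own words
    scores = X @ X.T / np.sqrt(X.shape[-1])
    return softmax(scores) @ X

def cross_attention(X, Y):
    # each word of one argument attends over the other argument's words
    scores = X @ Y.T / np.sqrt(X.shape[-1])
    return softmax(scores) @ Y

def dual_attention_fuse(X, Y, W):
    # Fuse the two attentive views of each argument by concatenation
    # followed by a shared linear projection W (a hypothetical choice;
    # the paper's actual fusion may differ).
    a1 = np.concatenate([self_attention(X), cross_attention(X, Y)], axis=-1)
    a2 = np.concatenate([self_attention(Y), cross_attention(Y, X)], axis=-1)
    return a1 @ W, a2 @ W
```

With argument embeddings `X` of shape `(n1, d)` and `Y` of shape `(n2, d)`, and a projection `W` of shape `(2d, d)`, the fused outputs keep the per-argument shapes `(n1, d)` and `(n2, d)`.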
Anthology ID:
2022.findings-acl.256
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3247–3257
URL:
https://aclanthology.org/2022.findings-acl.256
DOI:
10.18653/v1/2022.findings-acl.256
Cite (ACL):
Wei Xiang, Bang Wang, Lu Dai, and Yijun Mo. 2022. Encoding and Fusing Semantic Connection and Linguistic Evidence for Implicit Discourse Relation Recognition. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3247–3257, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Encoding and Fusing Semantic Connection and Linguistic Evidence for Implicit Discourse Relation Recognition (Xiang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2022.findings-acl.256.pdf
Code:
hustminslab/manf