Syntax-Enhanced Self-Attention-Based Semantic Role Labeling

Yue Zhang, Rui Wang, Luo Si

Abstract
As a fundamental NLP task, semantic role labeling (SRL) aims to identify the semantic roles of each predicate within a sentence. This paper investigates how to effectively incorporate syntactic knowledge into the SRL task. We present different approaches to encoding syntactic information derived from dependency trees of varying quality and representations; we propose a syntax-enhanced self-attention model and compare it with two other strong baseline methods; and we conduct experiments with newly published deep contextualized word representations as well. The experimental results demonstrate that with proper incorporation of high-quality syntactic information, our model achieves a new state-of-the-art performance for the Chinese SRL task on the CoNLL-2009 dataset.
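The page carries no implementation details beyond the abstract, so the following is only a rough, hypothetical sketch of one common way to bias self-attention with dependency-tree structure: an additive, learned attention bias indexed by clipped tree distance. The class name, the bias scheme, and all parameters below are illustrative assumptions, not the paper's actual mechanism.

```python
# Hypothetical sketch of syntax-biased self-attention (PyTorch).
# Assumption: syntactic structure enters as a per-pair additive bias on the
# attention logits, with one learned scalar per clipped dependency-tree
# distance. This is NOT the paper's exact model, only an illustration of
# the general technique.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntaxBiasedSelfAttention(nn.Module):
    def __init__(self, d_model: int, max_dist: int = 4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned bias per clipped tree distance (0 .. max_dist).
        self.dist_bias = nn.Embedding(max_dist + 1, 1)
        self.max_dist = max_dist
        self.scale = d_model ** 0.5

    def forward(self, x: torch.Tensor, tree_dist: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        # tree_dist: (batch, seq, seq) nonnegative integer distances between
        # tokens in the dependency tree (precomputed by the caller).
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / self.scale            # (b, s, s)
        bias = self.dist_bias(tree_dist.clamp(max=self.max_dist)).squeeze(-1)
        attn = F.softmax(scores + bias, dim=-1)                  # (b, s, s)
        return attn @ v                                          # (b, s, d)
```

In this sketch, a caller would precompute `tree_dist` from a dependency parse (e.g., shortest-path lengths between tokens in the tree), which is how tree quality and representation choices, as studied in the paper, would enter such a model.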
Anthology ID:
D19-1057
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
616–626
URL:
https://aclanthology.org/D19-1057
DOI:
10.18653/v1/D19-1057
Cite (ACL):
Yue Zhang, Rui Wang, and Luo Si. 2019. Syntax-Enhanced Self-Attention-Based Semantic Role Labeling. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 616–626, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Syntax-Enhanced Self-Attention-Based Semantic Role Labeling (Zhang et al., EMNLP-IJCNLP 2019)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/D19-1057.pdf