Abstract
The biaffine method is a strong and efficient approach to graph-based dependency parsing. However, previous work has used it only as a scorer at the end of the parser, and its application in multi-layer form has been overlooked. In this paper, we propose a multi-layer pseudo-Siamese biaffine model for neural dependency parsing. We modify the biaffine method so that it can be used in multi-layer form, and employ a pseudo-Siamese biaffine module to construct the arc weight matrix for the final prediction. In the proposed multi-layer architecture, the biaffine method serves simultaneously as a scorer and as an attention mechanism in each layer. We evaluate the model on PTB, CTB, and UD, where it achieves state-of-the-art results. Further experiments show the benefits of introducing the multi-layer form and the pseudo-Siamese module into the biaffine method, at a low cost in efficiency.
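For context, the sketch below shows a standard deep-biaffine arc scorer of the kind the abstract builds on (in the style of Dozat and Manning); it is not the authors' multi-layer pseudo-Siamese variant, and the module name, dimensions, and tensor shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BiaffineArcScorer(nn.Module):
    """Minimal sketch of a standard deep-biaffine arc scorer.

    Given per-token head-role and dependent-role representations, it produces
    an n x n matrix of arc scores, where scores[i, j] measures how likely
    token j is the head of token i. Names and sizes here are illustrative.
    """

    def __init__(self, dep_dim: int, head_dim: int):
        super().__init__()
        # Bilinear weight U; the extra "+ 1" rows/columns act as bias terms,
        # letting each side also contribute a linear score.
        self.U = nn.Parameter(torch.zeros(dep_dim + 1, head_dim + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, dep: torch.Tensor, head: torch.Tensor) -> torch.Tensor:
        # dep:  (batch, n, dep_dim)   dependent-role representations
        # head: (batch, n, head_dim)  head-role representations
        ones = dep.new_ones(dep.shape[:-1] + (1,))
        dep = torch.cat([dep, ones], dim=-1)    # (batch, n, dep_dim + 1)
        head = torch.cat([head, ones], dim=-1)  # (batch, n, head_dim + 1)
        # scores[b, i, j] = dep[b, i] @ U @ head[b, j]
        return dep @ self.U @ head.transpose(-1, -2)
```

In a graph-based parser, taking the argmax over the last dimension (candidate heads) for each token, or running a maximum-spanning-tree decoder over the score matrix, yields each token's predicted head.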
- Anthology ID: 2022.coling-1.486
- Volume: Proceedings of the 29th International Conference on Computational Linguistics
- Month: October
- Year: 2022
- Address: Gyeongju, Republic of Korea
- Editors: Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
- Venue: COLING
- Publisher: International Committee on Computational Linguistics
- Pages: 5476–5487
- URL: https://aclanthology.org/2022.coling-1.486
- Cite (ACL): Ziyao Xu, Houfeng Wang, and Bingdong Wang. 2022. Multi-Layer Pseudo-Siamese Biaffine Model for Dependency Parsing. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5476–5487, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
- Cite (Informal): Multi-Layer Pseudo-Siamese Biaffine Model for Dependency Parsing (Xu et al., COLING 2022)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/2022.coling-1.486.pdf
- Code: xzy-xzy/mlpsb-parser