Improving Constituency Parsing with Span Attention

Yuanhe Tian, Yan Song, Fei Xia, Tong Zhang


Abstract
Constituency parsing is a fundamental and important task for natural language understanding, and good representations of contextual information can benefit it. N-grams, a conventional type of feature for capturing contextual information, have been demonstrated useful in many tasks and could therefore also benefit constituency parsing if appropriately modeled. In this paper, we propose span attention for neural chart-based constituency parsing to leverage n-gram information. Because current chart-based parsers with Transformer-based encoders represent spans by subtracting the hidden states at the span boundaries, which may cause information loss, especially for long spans, we incorporate n-grams into span representations by weighting them according to their contributions to the parsing process. Moreover, we propose categorical span attention to further enhance the model by weighting n-grams within different length categories, thereby benefiting long-sentence parsing. Experimental results on three widely used benchmark datasets demonstrate the effectiveness of our approach in parsing Arabic, Chinese, and English, where our approach achieves state-of-the-art performance on all of them.
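The span-attention mechanism described in the abstract can be illustrated with a small sketch. The snippet below is a hypothetical PyTorch re-implementation based only on the abstract, not the authors' released code (cuhksz-nlp/SAPar): a span is represented by subtracting the encoder hidden states at its boundaries, and that vector attends over the embeddings of the n-grams appearing inside the span. All class names, dimensions, and the bilinear scoring function are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class SpanAttention(nn.Module):
    """Hedged sketch of span attention over n-gram embeddings.

    Assumption: the boundary-subtraction span vector scores each n-gram
    with a learned bilinear map, and the softmax-weighted sum of n-gram
    embeddings is concatenated back onto the span representation.
    """

    def __init__(self, hidden_dim: int, ngram_dim: int):
        super().__init__()
        # Bilinear scoring between the span vector and each n-gram embedding.
        self.score = nn.Linear(hidden_dim, ngram_dim, bias=False)

    def forward(self, span_repr: torch.Tensor, ngram_embs: torch.Tensor) -> torch.Tensor:
        # span_repr:  (hidden_dim,)            h_j - h_i from the encoder boundaries
        # ngram_embs: (num_ngrams, ngram_dim)  embeddings of n-grams inside the span
        scores = ngram_embs @ self.score(span_repr)   # (num_ngrams,) attention logits
        weights = torch.softmax(scores, dim=0)        # contribution weight per n-gram
        context = weights @ ngram_embs                # (ngram_dim,) weighted n-gram sum
        # Enhance the boundary-subtraction span vector with n-gram context.
        return torch.cat([span_repr, context], dim=-1)

# Toy usage: a span over 10 encoder hidden states, with three candidate n-grams.
hidden_dim, ngram_dim = 8, 6
h = torch.randn(10, hidden_dim)             # encoder outputs for 10 tokens
span_repr = h[7] - h[2]                     # span (2, 7] by boundary subtraction
ngram_embs = torch.randn(3, ngram_dim)      # n-grams extracted from the span
attn = SpanAttention(hidden_dim, ngram_dim)
enhanced = attn(span_repr, ngram_embs)      # (hidden_dim + ngram_dim,)
print(enhanced.shape)
```

The categorical span attention described in the abstract would presumably apply this attention separately within each n-gram length category (e.g., unigrams, bigrams, trigrams) and concatenate the per-category contexts, so that frequent short n-grams cannot dominate the weights for long spans.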
Anthology ID:
2020.findings-emnlp.153
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1691–1703
URL:
https://aclanthology.org/2020.findings-emnlp.153
DOI:
10.18653/v1/2020.findings-emnlp.153
Cite (ACL):
Yuanhe Tian, Yan Song, Fei Xia, and Tong Zhang. 2020. Improving Constituency Parsing with Span Attention. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1691–1703, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Constituency Parsing with Span Attention (Tian et al., Findings 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2020.findings-emnlp.153.pdf
Optional supplementary material:
 2020.findings-emnlp.153.OptionalSupplementaryMaterial.zip
Code:
 cuhksz-nlp/SAPar
Data:
Penn Treebank