Abstract
Supertagging is conventionally regarded as an important task for combinatory categorial grammar (CCG) parsing, and effective modeling of contextual information is crucial to it. However, existing studies have made limited efforts to leverage contextual features beyond applying powerful encoders (e.g., bi-LSTM). In this paper, we propose attentive graph convolutional networks to enhance neural CCG supertagging through a novel way of leveraging contextual information. Specifically, we build a graph from chunks (n-grams) extracted from a lexicon and apply attention over the graph, so that word pairs drawn from contexts within and across chunks are weighted differently in the model and facilitate supertagging accordingly. Experiments on CCGbank demonstrate that our approach outperforms all previous studies in both supertagging and parsing. Further analyses illustrate the effectiveness of each component of our approach in discriminatively learning from word pairs to enhance CCG supertagging.
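To give a concrete picture of the idea described above, the sketch below illustrates one attentive graph convolutional layer: words that co-occur in an extracted chunk (n-gram) are connected in a graph, and attention over those connections decides how much each neighboring word contributes to a word's representation before supertag prediction. This is a minimal, illustrative PyTorch sketch with hypothetical names and shapes, not the authors' implementation; the official code is at cuhksz-nlp/NeST-CCG.

```python
# Minimal, illustrative sketch of an attentive graph convolutional layer
# (hypothetical names/shapes; official implementation: cuhksz-nlp/NeST-CCG).
import torch
import torch.nn as nn


class AttentiveGCNLayer(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.value = nn.Linear(hidden_size, hidden_size)            # transform neighbor states
        self.pair_score = nn.Bilinear(hidden_size, hidden_size, 1)  # score each word pair
        self.act = nn.ReLU()

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, seq_len, hidden) encoder outputs for each word
        # adj: (batch, seq_len, seq_len) 1 if two words share a chunk (n-gram), else 0
        b, n, d = h.size()
        hi = h.unsqueeze(2).expand(b, n, n, d).reshape(-1, d)   # word i repeated over j
        hj = h.unsqueeze(1).expand(b, n, n, d).reshape(-1, d)   # word j repeated over i
        scores = self.pair_score(hi, hj).view(b, n, n)          # raw pairwise scores
        scores = scores.masked_fill(adj == 0, float("-inf"))    # keep only in-graph pairs
        attn = torch.softmax(scores, dim=-1)                    # weight each connected pair
        attn = torch.nan_to_num(attn)                           # words with no neighbors -> 0
        out = torch.matmul(attn, self.value(h))                 # attention-weighted aggregation
        return self.act(out + h)                                # residual connection


# Toy usage: a batch of 2 ten-word sentences with self-loops only in the graph.
layer = AttentiveGCNLayer(hidden_size=768)
h = torch.randn(2, 10, 768)                        # e.g., outputs of a contextual encoder
adj = torch.eye(10).unsqueeze(0).repeat(2, 1, 1)   # toy adjacency; real one comes from chunks
out = layer(h, adj)                                # (2, 10, 768)
```

In a setup like the one the abstract describes, such a layer would sit on top of a contextual encoder, the adjacency would mark every pair of words falling inside the same lexicon-extracted chunk, and each word's supertag would be predicted from the resulting representation.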
- Anthology ID: 2020.emnlp-main.487
- Original: 2020.emnlp-main.487v1
- Version 2: 2020.emnlp-main.487v2
- Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month: November
- Year: 2020
- Address: Online
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 6037–6044
- URL: https://aclanthology.org/2020.emnlp-main.487
- DOI: 10.18653/v1/2020.emnlp-main.487
- Cite (ACL): Yuanhe Tian, Yan Song, and Fei Xia. 2020. Supertagging Combinatory Categorial Grammar with Attentive Graph Convolutional Networks. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 6037–6044, Online. Association for Computational Linguistics.
- Cite (Informal): Supertagging Combinatory Categorial Grammar with Attentive Graph Convolutional Networks (Tian et al., EMNLP 2020)
- PDF: https://preview.aclanthology.org/paclic-22-ingestion/2020.emnlp-main.487.pdf
- Code: cuhksz-nlp/NeST-CCG
- Data: CCGbank