Abstract
Recent studies show that integrating syntactic tree models with sequential semantic models can improve task performance; however, these methods mostly employ a shallow integration of syntax and semantics. In this paper, we propose a deep neural communication model between syntax and semantics to improve the performance of text understanding. Local communication is performed between the syntactic tree encoder and the sequential semantic encoder for mutual information exchange, and global communication further ensures comprehensive information propagation. Results on multiple syntax-dependent tasks show that our model outperforms strong baselines by a large margin. In-depth analysis indicates that our method is highly effective at composing sentence semantics.
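The abstract only sketches the architecture, so the following is a minimal, hypothetical PyTorch sketch of the idea it describes, not the authors' implementation: a sequential semantic encoder and a stand-in syntactic encoder exchange information through gated cross-attention ("local communication"), and both views are then pooled and fused into a single sentence representation ("global communication"). All module names, dimensions, and the gating scheme here are illustrative assumptions.

```python
# Hypothetical sketch only -- NOT the authors' implementation. Module names,
# dimensions, and the gating scheme are assumptions based on the abstract.
import torch
import torch.nn as nn


class LocalCommunication(nn.Module):
    """One round of mutual information exchange between syntactic (tree)
    states and sequential (semantic) states via gated cross-attention."""

    def __init__(self, dim: int):
        super().__init__()
        self.syn_to_sem = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.sem_to_syn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.gate_sem = nn.Linear(2 * dim, dim)
        self.gate_syn = nn.Linear(2 * dim, dim)

    def forward(self, syn, sem):
        # Each side queries the other, then gates the retrieved message
        # into its own representation (residual update).
        msg_sem, _ = self.syn_to_sem(sem, syn, syn)  # semantics reads syntax
        msg_syn, _ = self.sem_to_syn(syn, sem, sem)  # syntax reads semantics
        sem = sem + torch.sigmoid(self.gate_sem(torch.cat([sem, msg_sem], -1))) * msg_sem
        syn = syn + torch.sigmoid(self.gate_syn(torch.cat([syn, msg_syn], -1))) * msg_syn
        return syn, sem


class SyntaxSemanticsModel(nn.Module):
    """Sequential semantic encoder (BiLSTM) plus a stand-in syntactic encoder,
    coupled by local communication rounds and a global fusion step."""

    def __init__(self, vocab_size: int, dim: int = 128, rounds: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.sem_enc = nn.LSTM(dim, dim // 2, bidirectional=True, batch_first=True)
        # Stand-in for a tree encoder (e.g., a Tree-LSTM over a parse);
        # a real implementation would consume parse trees here instead.
        self.syn_enc = nn.LSTM(dim, dim // 2, bidirectional=True, batch_first=True)
        self.local = nn.ModuleList([LocalCommunication(dim) for _ in range(rounds)])
        self.global_fuse = nn.Linear(2 * dim, dim)  # "global communication"

    def forward(self, tokens):
        x = self.embed(tokens)
        sem, _ = self.sem_enc(x)
        syn, _ = self.syn_enc(x)
        for layer in self.local:  # repeated local exchange
            syn, sem = layer(syn, sem)
        # Global step: pool both views and fuse into one sentence vector.
        pooled = torch.cat([syn.mean(dim=1), sem.mean(dim=1)], dim=-1)
        return torch.tanh(self.global_fuse(pooled))


model = SyntaxSemanticsModel(vocab_size=1000)
tokens = torch.randint(0, 1000, (2, 12))  # batch of 2 sentences, 12 token ids each
print(model(tokens).shape)                # torch.Size([2, 128])
```

In the paper's setting, the stand-in `syn_enc` would be a genuine tree encoder over the sentence's parse, and the fused vector would feed a task-specific classifier (e.g., for the SICK dataset listed below).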
- Anthology ID: 2020.findings-emnlp.8
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
- Month: November
- Year: 2020
- Address: Online
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 84–93
- URL: https://aclanthology.org/2020.findings-emnlp.8
- DOI: 10.18653/v1/2020.findings-emnlp.8
- Cite (ACL): Hao Fei, Yafeng Ren, and Donghong Ji. 2020. Improving Text Understanding via Deep Syntax-Semantics Communication. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 84–93, Online. Association for Computational Linguistics.
- Cite (Informal): Improving Text Understanding via Deep Syntax-Semantics Communication (Fei et al., Findings 2020)
- PDF: https://preview.aclanthology.org/nodalida-main-page/2020.findings-emnlp.8.pdf
- Data: SICK