Abstract
Pre-trained language models such as BERT have achieved state-of-the-art performance on natural language inference (NLI). However, it has been shown that such models can be tricked by variations in surface patterns such as syntax. We investigate the use of dependency trees to enhance the generalization of BERT on the NLI task, leveraging a graph convolutional network to represent a syntax-based matching graph with heterogeneous matching patterns. Experimental results show that our syntax-based method largely enhances the generalization of BERT on a test set where sentence pairs have high lexical overlap but diverse syntactic structures, and does not degrade performance on the standard test set. In other words, the proposed method makes BERT more robust to syntactic changes.
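The abstract describes the architecture only at a high level: token representations (e.g., from BERT) are aggregated over a dependency-based graph with a graph convolutional network. As a rough illustration, a single generic GCN update over a dependency adjacency matrix might look like the sketch below. This is not the paper's exact CA-GCN formulation (see the heqi2015/ca_gcn repository for that); the names `GCNLayer`, `h`, and `adj` are illustrative assumptions.

```python
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution: h' = ReLU(D^{-1} A h W).

    A is a (batched) 0/1 adjacency matrix over tokens, typically built
    from dependency arcs plus self-loops. This is a generic GCN update,
    not necessarily the exact variant used in the paper.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (batch, n, dim) token vectors; adj: (batch, n, n) arcs.
        deg = adj.sum(-1, keepdim=True).clamp(min=1)  # row degrees, no div-by-zero
        return torch.relu(self.linear(adj @ h / deg))  # mean over neighbors, then project


if __name__ == "__main__":
    # Hypothetical usage: stand-ins for BERT outputs and a parser-built
    # adjacency matrix (here self-loops only, for the demo).
    batch, n, dim = 2, 8, 768
    h = torch.randn(batch, n, dim)
    adj = torch.eye(n).expand(batch, n, n)
    print(GCNLayer(dim)(h, adj).shape)  # torch.Size([2, 8, 768])
```

In the paper's setting, `adj` would encode the dependency trees of the premise and hypothesis (and, per the abstract, cross-sentence matching edges), so the graph convolution propagates information along syntactic structure rather than raw word order.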
- Anthology ID: 2020.findings-emnlp.447
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
- Month: November
- Year: 2020
- Address: Online
- Editors: Trevor Cohn, Yulan He, Yang Liu
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 4973–4978
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2020.findings-emnlp.447/
- DOI: 10.18653/v1/2020.findings-emnlp.447
- Cite (ACL): Qi He, Han Wang, and Yue Zhang. 2020. Enhancing Generalization in Natural Language Inference by Syntax. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4973–4978, Online. Association for Computational Linguistics.
- Cite (Informal): Enhancing Generalization in Natural Language Inference by Syntax (He et al., Findings 2020)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2020.findings-emnlp.447.pdf
- Code: heqi2015/ca_gcn
- Data: GLUE, MultiNLI