Linguistic Frameworks Go Toe-to-Toe at Neuro-Symbolic Language Modeling

Jakob Prange, Nathan Schneider, Lingpeng Kong


Abstract
We examine the extent to which, in principle, different syntactic and semantic graph representations can complement and improve neural language modeling. Specifically, by conditioning on a subgraph encapsulating the locally relevant sentence history, can a model make better next-word predictions than a pretrained sequential language model alone? With an ensemble setup consisting of GPT-2 and ground-truth graphs from one of 7 different formalisms, we find that the graph information indeed improves perplexity and other metrics. Moreover, this architecture provides a new way to compare different frameworks of linguistic representation. In our oracle graph setup, training and evaluating on English WSJ, semantic constituency structures prove most useful to language modeling performance—outpacing syntactic constituency structures as well as syntactic and semantic dependency structures.
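The abstract describes an ensemble in which next-word predictions from a pretrained sequential LM (GPT-2) are combined with a component conditioned on a subgraph of the locally relevant sentence history. The sketch below is a minimal illustration of that general idea, not the authors' implementation (their code is in the repository linked under "Code" below): the GraphEnsembleLM class, graph_dim, the linear graph_scorer over a precomputed subgraph embedding, and the learned scalar gate are all assumptions made for illustration.

import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

class GraphEnsembleLM(nn.Module):
    """Gate GPT-2's next-token logits against logits from a
    (hypothetical) scorer over an encoding of the local subgraph."""

    def __init__(self, graph_dim: int = 128, model_name: str = "gpt2"):
        super().__init__()
        self.gpt2 = GPT2LMHeadModel.from_pretrained(model_name)
        # Hypothetical graph-side component: maps a precomputed subgraph
        # embedding to next-token logits over GPT-2's vocabulary.
        self.graph_scorer = nn.Linear(graph_dim, self.gpt2.config.vocab_size)
        # Learned scalar gate interpolating the two logit vectors.
        self.gate = nn.Parameter(torch.tensor(0.0))

    def forward(self, input_ids, graph_emb):
        seq_logits = self.gpt2(input_ids).logits[:, -1, :]  # last position
        graph_logits = self.graph_scorer(graph_emb)
        g = torch.sigmoid(self.gate)
        return g * seq_logits + (1.0 - g) * graph_logits

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GraphEnsembleLM()
ids = tokenizer("The chef who ran the marathon prepared a", return_tensors="pt").input_ids
graph_emb = torch.zeros(1, 128)  # stand-in for a real subgraph encoding
print(tokenizer.decode(model(ids, graph_emb).argmax(dim=-1)))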
Anthology ID: 2022.naacl-main.325
Volume: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: July
Year: 2022
Address: Seattle, United States
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 4375–4391
URL: https://aclanthology.org/2022.naacl-main.325
DOI: 10.18653/v1/2022.naacl-main.325
Cite (ACL): Jakob Prange, Nathan Schneider, and Lingpeng Kong. 2022. Linguistic Frameworks Go Toe-to-Toe at Neuro-Symbolic Language Modeling. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4375–4391, Seattle, United States. Association for Computational Linguistics.
Cite (Informal): Linguistic Frameworks Go Toe-to-Toe at Neuro-Symbolic Language Modeling (Prange et al., NAACL 2022)
PDF: https://preview.aclanthology.org/auto-file-uploads/2022.naacl-main.325.pdf
Video: https://preview.aclanthology.org/auto-file-uploads/2022.naacl-main.325.mp4
Code: jakpra/linguisticstructurelm
Data: Penn Treebank, Universal Dependencies