Recurrent Neural Network CCG Parser

Sora Tagami, Daisuke Bekki


Abstract
Two contrasting approaches to natural language inference (NLI) are end-to-end neural systems and linguistically oriented pipelines built from modules such as neural CCG parsers and theorem provers. The latter, however, faces the challenge of integrating the neural models used in its syntactic and semantic components. Recurrent neural network grammars (RNNGs) are frameworks that can potentially fill this gap, but conventional RNNGs adopt context-free grammar (CFG) as the syntactic theory. To address this issue, we implemented RNN-CCG, a syntactic parser that replaces CFG with combinatory categorial grammar (CCG). We then conducted experiments comparing RNN-CCG to RNNGs with and without POS tags, and evaluated their behavior as a first step toward building an NLI system based on RNN-CCG.
Anthology ID: 2023.naloma-1.4
Volume: Proceedings of the 4th Natural Logic Meets Machine Learning Workshop
Month: June
Year: 2023
Address: Nancy, France
Editors: Stergios Chatzikyriakidis, Valeria de Paiva
Venues: NALOMA | WS
SIG: SIGSEM
Publisher: Association for Computational Linguistics
Pages: 35–40
URL: https://aclanthology.org/2023.naloma-1.4
Cite (ACL): Sora Tagami and Daisuke Bekki. 2023. Recurrent Neural Network CCG Parser. In Proceedings of the 4th Natural Logic Meets Machine Learning Workshop, pages 35–40, Nancy, France. Association for Computational Linguistics.
Cite (Informal): Recurrent Neural Network CCG Parser (Tagami & Bekki, NALOMA-WS 2023)
PDF: https://preview.aclanthology.org/nschneid-patch-2/2023.naloma-1.4.pdf