Abstract
This work experiments with various configurations of transformer-based sequence-to-sequence neural networks for training a Discourse Representation Structure (DRS) parser, and presents the results along with code to reproduce our experiments for use by the community working on DRS parsing. These configurations have not been tested in prior work on this task. The models are trained on the Parallel Meaning Bank (PMB) English data sets and evaluated on the PMB test sets using Counter, the standard evaluation tool for DRSs. We show that performance improves upon the previous state of the art by 0.5 F1 (%) on the PMB 2.2.0 test set and by 1.02 F1 (%) on the PMB 3.0.0 test set. We also present results on PMB 4.0.0, which has not been evaluated with Counter in previous research.
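As a rough illustration of the kind of setup the abstract describes, the sketch below trains a small transformer sequence-to-sequence model that maps an English sentence to a flattened DRS clause string. This is a minimal sketch only: the toy sentence/DRS pair, the clause format, the shared vocabulary, the learned positional embeddings, and all hyperparameters are illustrative assumptions, not the authors' configurations or the actual PMB data format.

```python
# A minimal sketch, NOT the authors' setup: a tiny transformer
# sequence-to-sequence model mapping an English sentence to a flattened
# DRS clause string. Data, vocabulary, and hyperparameters are invented
# for illustration; the PMB provides the real training pairs.
import torch
import torch.nn as nn

# One toy English -> DRS-clause pair (hypothetical clause format).
pairs = [("Tom sleeps .",
          "b1 REF x1 b1 Name x1 tom b1 sleep v.01 e1 b1 Agent e1 x1")]

specials = ["<pad>", "<bos>", "<eos>"]
tokens = sorted({t for s, d in pairs for t in (s + " " + d).split()})
vocab = {t: i for i, t in enumerate(specials + tokens)}
PAD, BOS, EOS = (vocab[s] for s in specials)

def encode(text):
    return [vocab[t] for t in text.split()]

# One model configuration; such experiments vary depth, width, heads, etc.
d_model, nhead, n_layers = 64, 4, 2
emb = nn.Embedding(len(vocab), d_model, padding_idx=PAD)
pos = nn.Embedding(256, d_model)  # learned positions, for brevity
model = nn.Transformer(d_model=d_model, nhead=nhead,
                       num_encoder_layers=n_layers,
                       num_decoder_layers=n_layers,
                       dim_feedforward=128, batch_first=True)
proj = nn.Linear(d_model, len(vocab))

def embed(ids):  # token + position embeddings
    return emb(ids) + pos(torch.arange(ids.size(1)).unsqueeze(0))

params = [p for m in (emb, pos, model, proj) for p in m.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

src = torch.tensor([encode(pairs[0][0])])                  # (1, S)
tgt = torch.tensor([[BOS] + encode(pairs[0][1]) + [EOS]])  # (1, T)

for step in range(100):
    opt.zero_grad()
    tgt_in, tgt_out = tgt[:, :-1], tgt[:, 1:]   # teacher forcing shift
    # Causal mask: the decoder may not look at future DRS tokens.
    mask = model.generate_square_subsequent_mask(tgt_in.size(1))
    out = model(embed(src), embed(tgt_in), tgt_mask=mask)
    loss = loss_fn(proj(out).reshape(-1, len(vocab)), tgt_out.reshape(-1))
    loss.backward()
    opt.step()
print(f"final training loss: {loss.item():.3f}")
```

For scoring, the paper relies on Counter; assuming the counter.py script from the DRS_parsing toolkit and its -f1/-f2 flags, a produced clause file is typically compared against gold with `python counter.py -f1 output.txt -f2 gold.txt`, which reports precision, recall, and F1 over matched clauses.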
- Anthology ID: 2023.iwcs-1.9
- Volume: Proceedings of the 15th International Conference on Computational Semantics
- Month: June
- Year: 2023
- Address: Nancy, France
- Editors: Maxime Amblard, Ellen Breitholtz
- Venue: IWCS
- SIG: SIGSEM
- Publisher: Association for Computational Linguistics
- Pages: 83–88
- URL: https://aclanthology.org/2023.iwcs-1.9
- Cite (ACL): Ahmet Yildirim and Dag Haug. 2023. Experiments in training transformer sequence-to-sequence DRS parsers. In Proceedings of the 15th International Conference on Computational Semantics, pages 83–88, Nancy, France. Association for Computational Linguistics.
- Cite (Informal): Experiments in training transformer sequence-to-sequence DRS parsers (Yildirim & Haug, IWCS 2023)
- PDF: https://aclanthology.org/2023.iwcs-1.9.pdf