Linguistic realisation as machine translation: Comparing different MT models for AMR-to-text generation

Thiago Castro Ferreira, Iacer Calixto, Sander Wubben, Emiel Krahmer


Abstract
In this paper, we study AMR-to-text generation, framing it as a translation task and comparing two MT approaches (Phrase-based and Neural MT). We systematically investigate the effects of three AMR preprocessing steps (Delexicalisation, Compression, and Linearisation) applied before the MT phase. Our results show that preprocessing indeed helps, although the benefits differ for the two MT models.
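
To make the preprocessing pipeline more concrete, the following minimal Python sketch illustrates a linearisation step of the kind described in the abstract: a PENMAN-formatted AMR graph is flattened into a one-line token sequence that an MT system can consume. The function name, the regex-based dropping of variable names, and the example AMR are illustrative assumptions, not the authors' actual implementation.

    import re

    def linearise_amr(amr: str) -> str:
        """Flatten a PENMAN-formatted AMR into a one-line token sequence.

        Variable names (e.g. the "w /" in "(w / want-01 ...)") are dropped and
        brackets are space-separated; re-entrant variables (plain "b") are kept.
        This is a simplified stand-in for the linearisation step, not the
        authors' preprocessing code.
        """
        no_vars = re.sub(r"\(\s*\w+\s*/\s*", "( ", amr)           # "(w / want-01" -> "( want-01"
        spaced = no_vars.replace("(", " ( ").replace(")", " ) ")  # isolate brackets as tokens
        return " ".join(spaced.split())                           # collapse whitespace and newlines

    if __name__ == "__main__":
        amr = """
        (w / want-01
           :ARG0 (b / boy)
           :ARG1 (g / go-02
                    :ARG0 b))
        """
        print(linearise_amr(amr))
        # ( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 b ) )

Delexicalisation and compression could be added as further string-to-string transformations before this step; the paper compares how such choices affect Phrase-based and Neural MT differently.
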
Anthology ID: W17-3501
Volume: Proceedings of the 10th International Conference on Natural Language Generation
Month: September
Year: 2017
Address: Santiago de Compostela, Spain
Editors: Jose M. Alonso, Alberto Bugarín, Ehud Reiter
Venue: INLG
SIG: SIGGEN
Publisher: Association for Computational Linguistics
Pages: 1–10
URL: https://aclanthology.org/W17-3501
DOI: 10.18653/v1/W17-3501
Cite (ACL): Thiago Castro Ferreira, Iacer Calixto, Sander Wubben, and Emiel Krahmer. 2017. Linguistic realisation as machine translation: Comparing different MT models for AMR-to-text generation. In Proceedings of the 10th International Conference on Natural Language Generation, pages 1–10, Santiago de Compostela, Spain. Association for Computational Linguistics.
Cite (Informal): Linguistic realisation as machine translation: Comparing different MT models for AMR-to-text generation (Castro Ferreira et al., INLG 2017)
PDF: https://aclanthology.org/W17-3501.pdf