A Framework for Bidirectional Decoding: Case Study in Morphological Inflection

Marc Canby, Julia Hockenmaier


Abstract
Transformer-based encoder-decoder models that generate outputs in a left-to-right fashion have become standard for sequence-to-sequence tasks. In this paper, we propose a framework for decoding that produces sequences from the “outside-in”: at each step, the model chooses to generate a token on the left, generate a token on the right, or join the left and right sequences. We argue that this is more principled than prior bidirectional decoders. Our proposal supports a variety of model architectures and includes several training methods, such as a dynamic programming algorithm that marginalizes out the latent ordering variable. Our model sets state-of-the-art (SOTA) on the 2022 and 2023 shared tasks, beating the next-best systems by over 4.7 and 2.7 points in average accuracy, respectively. The model performs particularly well on long sequences, can implicitly learn the split point of words composed of stem and affix, and performs better relative to the baseline on datasets that have fewer unique lemmas.
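The abstract's "outside-in" generation loop can be illustrated with a minimal greedy sketch. This is not the authors' implementation: the `score_actions` hook is a hypothetical stand-in for the model, returning scored candidate actions of the form LEFT (extend the prefix), RIGHT (extend the suffix from the end inward), or JOIN (stop and concatenate the two halves).

```python
def outside_in_decode(score_actions, max_len=50):
    """Greedy 'outside-in' decoding sketch.

    `score_actions(left, right)` is a hypothetical model hook returning
    (action, token, score) tuples, where action is "LEFT" (append the
    token to the left prefix), "RIGHT" (prepend the token to the right
    suffix), or "JOIN" (stop and concatenate prefix and suffix).
    """
    left, right = [], []
    for _ in range(max_len):
        # Pick the highest-scoring action at this step (greedy decoding).
        action, token, _ = max(score_actions(left, right), key=lambda a: a[2])
        if action == "JOIN":
            break
        if action == "LEFT":
            left.append(token)
        else:  # "RIGHT": the suffix grows from the outside in
            right.insert(0, token)
    return left + right


# Toy scripted "model" that inflects a stem+affix word, e.g. "walk" + "ed":
script = iter([("LEFT", "w", 1.0), ("RIGHT", "d", 1.0), ("LEFT", "a", 1.0),
               ("RIGHT", "e", 1.0), ("LEFT", "l", 1.0), ("LEFT", "k", 1.0),
               ("JOIN", None, 1.0)])
print("".join(outside_in_decode(lambda l, r: [next(script)])))  # walked
```

In the paper's framework, the order in which LEFT and RIGHT actions interleave is a latent variable; the scripted example above fixes one such ordering, whereas training can marginalize over all of them.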
Anthology ID:
2023.findings-emnlp.297
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4485–4507
URL:
https://aclanthology.org/2023.findings-emnlp.297
DOI:
10.18653/v1/2023.findings-emnlp.297
Cite (ACL):
Marc Canby and Julia Hockenmaier. 2023. A Framework for Bidirectional Decoding: Case Study in Morphological Inflection. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 4485–4507, Singapore. Association for Computational Linguistics.
Cite (Informal):
A Framework for Bidirectional Decoding: Case Study in Morphological Inflection (Canby & Hockenmaier, Findings 2023)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2023.findings-emnlp.297.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2023.findings-emnlp.297.mp4