Attention Strategies for Multi-Source Sequence-to-Sequence Learning

Jindřich Libovický, Jindřich Helcl


Abstract
Modeling attention in neural multi-source sequence-to-sequence learning remains a relatively unexplored area, despite its usefulness in tasks that incorporate multiple source languages or modalities. We propose two novel approaches, flat and hierarchical, to combining the outputs of the attention mechanisms over each source sequence. We compare the proposed methods with existing techniques and present the results of a systematic evaluation of those methods on the WMT16 Multimodal Translation and Automatic Post-editing tasks. We show that the proposed methods achieve competitive results on both tasks.
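
The two strategies named in the abstract can be summarized as follows: flat attention computes a single attention distribution jointly over the concatenated states of all encoders, while hierarchical attention first attends to each source independently and then applies a second attention over the resulting per-source context vectors. Below is a minimal NumPy sketch of both combination strategies. It is an illustration, not the paper's exact parametrization: the additive (Bahdanau-style) energy function and all parameter names (W, Us, v, W_b, U_bs, v_b, U_cs) are assumptions made here for clarity.

import numpy as np

# NOTE: illustrative sketch; the energy function and the
# per-encoder projections are assumptions, not the paper's
# exact formulation.

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_energies(state, states, W, U, v):
    # Bahdanau-style energies: e_i = v^T tanh(W s + U h_i)
    return np.array([v @ np.tanh(W @ state + U @ h) for h in states])

def flat_attention(state, sources, W, Us, v, U_cs):
    # Flat combination: one softmax over all states of all encoders.
    # sources: list of arrays, each (len_k, d_k) of encoder states
    # Us, U_cs: per-encoder projections into shared energy/context spaces
    energies = np.concatenate(
        [additive_energies(state, h_k, W, U_k, v)
         for h_k, U_k in zip(sources, Us)])
    alphas = softmax(energies)
    # project each state into a shared context space before summing
    projected = np.concatenate(
        [h_k @ U_c.T for h_k, U_c in zip(sources, U_cs)], axis=0)
    return alphas @ projected

def hierarchical_attention(state, sources, W, Us, v, W_b, U_bs, v_b, U_cs):
    # Hierarchical combination: attend within each source, then
    # a second attention over the per-source context vectors.
    contexts = []
    for h_k, U_k in zip(sources, Us):
        alphas = softmax(additive_energies(state, h_k, W, U_k, v))
        contexts.append(alphas @ h_k)
    betas = softmax(np.array(
        [v_b @ np.tanh(W_b @ state + U_b @ c)
         for c, U_b in zip(contexts, U_bs)]))
    return sum(b * (U_c @ c)
               for b, c, U_c in zip(betas, contexts, U_cs))

# toy usage: two sources with different state dimensionalities
rng = np.random.default_rng(0)
d_s, d_e, d_out = 8, 6, 8
sources = [rng.normal(size=(5, 10)), rng.normal(size=(7, 12))]
state = rng.normal(size=d_s)
W = rng.normal(size=(d_e, d_s))
v = rng.normal(size=d_e)
Us = [rng.normal(size=(d_e, h.shape[1])) for h in sources]
U_cs = [rng.normal(size=(d_out, h.shape[1])) for h in sources]
print(flat_attention(state, sources, W, Us, v, U_cs).shape)  # (8,)
W_b = rng.normal(size=(d_e, d_s))
v_b = rng.normal(size=d_e)
U_bs = [rng.normal(size=(d_e, h.shape[1])) for h in sources]
print(hierarchical_attention(state, sources, W, Us, v,
                             W_b, U_bs, v_b, U_cs).shape)  # (8,)

In both sketches the per-encoder projection matrices let encoders with different state sizes share one context space, which is what makes a single joint softmax possible in the flat variant.
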
Anthology ID:
P17-2031
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
196–202
URL:
https://aclanthology.org/P17-2031
DOI:
10.18653/v1/P17-2031
Cite (ACL):
Jindřich Libovický and Jindřich Helcl. 2017. Attention Strategies for Multi-Source Sequence-to-Sequence Learning. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 196–202, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Attention Strategies for Multi-Source Sequence-to-Sequence Learning (Libovický & Helcl, ACL 2017)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/P17-2031.pdf
Presentation:
P17-2031.Presentation.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/P17-2031.mp4
Data
WMT 2016