Self-Attentional Models for Lattice Inputs
Matthias Sperber, Graham Neubig, Ngoc-Quan Pham, Alex Waibel
Abstract
Lattices are an efficient and effective method to encode ambiguity of upstream systems in natural language processing tasks, for example to compactly capture multiple speech recognition hypotheses, or to represent multiple linguistic analyses. Previous work has extended recurrent neural networks to model lattice inputs and achieved improvements in various tasks, but these models suffer from very slow computation speeds. This paper extends the recently proposed paradigm of self-attention to handle lattice inputs. Self-attention is a sequence modeling technique that relates inputs to one another by computing pairwise similarities and has gained popularity for both its strong results and its computational efficiency. To extend such models to handle lattices, we introduce probabilistic reachability masks that incorporate lattice structure into the model and support lattice scores if available. We also propose a method for adapting positional embeddings to lattice structures. We apply the proposed model to a speech translation task and find that it outperforms all examined baselines while being much faster to compute than previous neural lattice models during both training and inference.
- Anthology ID:
- P19-1115
- Volume:
- Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month:
- July
- Year:
- 2019
- Address:
- Florence, Italy
- Editors:
- Anna Korhonen, David Traum, Lluís Màrquez
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1185–1197
- URL:
- https://aclanthology.org/P19-1115
- DOI:
- 10.18653/v1/P19-1115
- Cite (ACL):
- Matthias Sperber, Graham Neubig, Ngoc-Quan Pham, and Alex Waibel. 2019. Self-Attentional Models for Lattice Inputs. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1185–1197, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal):
- Self-Attentional Models for Lattice Inputs (Sperber et al., ACL 2019)
- PDF:
- https://aclanthology.org/P19-1115.pdf
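
The abstract describes biasing self-attention with probabilistic reachability masks derived from the lattice. Below is a minimal illustrative sketch of that idea, not the authors' implementation: the function name `lattice_self_attention`, the toy 3-node lattice, and the use of log-probability biases on the attention logits are assumptions made here for illustration; the paper's actual formulation (see the PDF above) also adapts positional embeddings to the lattice structure.

```python
# Hypothetical sketch (not the paper's code): self-attention over lattice nodes,
# where reach_mask[i, j] is the probability that node j is reachable from node i
# (0 for nodes on mutually exclusive paths). Adding log-probabilities to the
# attention logits drives the weight of unreachable pairs toward zero.
import numpy as np

def lattice_self_attention(X, Wq, Wk, Wv, reach_mask, eps=1e-9):
    """X: (n, d) node embeddings; reach_mask: (n, n) reachability probabilities."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d_k)
    logits = logits + np.log(reach_mask + eps)          # large negative where unreachable
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy lattice: node 0 branches into alternative nodes 1 and 2 with scores 0.7 / 0.3.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
Wq = Wk = Wv = rng.normal(size=(4, 4))
reach = np.array([[1.0, 0.7, 0.3],
                  [0.7, 1.0, 0.0],   # nodes 1 and 2 lie on mutually exclusive paths
                  [0.3, 0.0, 1.0]])
out = lattice_self_attention(X, Wq, Wk, Wv, reach)
print(out.shape)  # (3, 4)
```

Because the mask only biases a pairwise similarity matrix, all node pairs are still scored in parallel, which is consistent with the abstract's claim of faster training and inference than recurrent lattice models.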