Assessing the Ability of Self-Attention Networks to Learn Word Order

Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, Zhaopeng Tu


Abstract
Self-attention networks (SAN) have attracted a lot of interest due to their high parallelization and strong performance on a variety of NLP tasks, e.g., machine translation. Because they lack the recurrence structure of recurrent neural networks (RNN), SAN are conjectured to be weak at learning positional information of words for sequence modeling. However, this speculation has neither been empirically confirmed, nor has it been explained why SAN perform strongly on machine translation despite “lacking positional information”. To this end, we propose a novel word reordering detection task to quantify how well SAN and RNN learn word order information. Specifically, we randomly move one word to another position and examine whether a trained model can detect both the original and the inserted positions. Experimental results reveal that: 1) SAN trained on word reordering detection indeed have difficulty learning positional information, even with position embedding; and 2) SAN trained on machine translation learn better positional information than their RNN counterpart, in which position embedding plays a critical role. Although the recurrence structure makes a model more universally effective at learning word order, learning objectives matter more in downstream tasks such as machine translation.
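The word reordering detection (WRD) task described in the abstract can be made concrete with a short Python example. This is a minimal illustrative sketch of the data perturbation only; the function name and details are assumptions made here for illustration, not taken from the authors' released code (see baosongyang/WRD for the official implementation).

import random

def make_wrd_instance(tokens, rng=random):
    # Illustrative sketch (assumed details, not the authors' code):
    # move one randomly chosen word to a different position and
    # return the perturbed sentence together with the two gold
    # labels a WRD model must predict: the original position the
    # word was taken from and the position it was inserted at.
    assert len(tokens) >= 2, "need at least two tokens to reorder"
    src = rng.randrange(len(tokens))        # original position
    word = tokens[src]
    rest = tokens[:src] + tokens[src + 1:]  # sentence with the word removed
    dst = rng.randrange(len(rest) + 1)      # candidate insertion point
    while dst == src:                       # avoid restoring the original order
        dst = rng.randrange(len(rest) + 1)
    perturbed = rest[:dst] + [word] + rest[dst:]
    return perturbed, src, dst

# Usage: a trained SAN or RNN model is then asked to recover both positions.
sentence = "the quick brown fox jumps over the lazy dog".split()
perturbed, orig_pos, ins_pos = make_wrd_instance(sentence)
print(perturbed, orig_pos, ins_pos)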
Anthology ID:
P19-1354
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3635–3644
URL:
https://aclanthology.org/P19-1354
DOI:
10.18653/v1/P19-1354
Cite (ACL):
Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, and Zhaopeng Tu. 2019. Assessing the Ability of Self-Attention Networks to Learn Word Order. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3635–3644, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Assessing the Ability of Self-Attention Networks to Learn Word Order (Yang et al., ACL 2019)
PDF:
https://preview.aclanthology.org/ingestion-script-update/P19-1354.pdf
Video:
https://vimeo.com/384961600
Code:
baosongyang/WRD