Word Order Matters When You Increase Masking

Karim Lasri, Alessandro Lenci, Thierry Poibeau


Abstract
Word order, an essential property of natural languages, is injected into Transformer-based neural language models through position encoding. However, recent experiments have shown that explicit position encoding is not always useful, since some models without such a feature have managed to achieve state-of-the-art performance on some tasks. To better understand this phenomenon, we examine the effect of removing position encodings on the pre-training objective itself (i.e., masked language modelling), to test whether models can reconstruct position information from co-occurrences alone. We do so by controlling the amount of masked tokens in the input sentence, as a proxy to modulate the importance of position information for the task. We find that the need for position information increases with the amount of masking, and that masked language models without position encodings are unable to reconstruct this information on the task. These findings point towards a direct relationship between the amount of masking and the ability of Transformers to capture order-sensitive aspects of language using position encoding.
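To make the experimental manipulation concrete, here is a minimal sketch of how one might vary the proportion of masked tokens in a masked-language-modelling objective using Hugging Face's DataCollatorForLanguageModeling. The specific masking rates and the use of bert-base-uncased are illustrative assumptions, not the paper's exact setup.

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Sweep over masking rates: the paper varies the amount of masked tokens
# as a proxy for how much position information the objective requires.
# The rates below are chosen for illustration only.
for mlm_probability in (0.15, 0.30, 0.50, 0.75):
    collator = DataCollatorForLanguageModeling(
        tokenizer=tokenizer, mlm=True, mlm_probability=mlm_probability
    )
    batch = collator([tokenizer("word order matters when you increase masking")])
    # batch["input_ids"] now has roughly mlm_probability of its tokens
    # replaced by [MASK] (or randomly substituted/kept), and batch["labels"]
    # holds the original ids at masked positions (-100 elsewhere).
    print(mlm_probability, tokenizer.decode(batch["input_ids"][0]))
```

With higher masking rates, fewer context tokens remain, so co-occurrence cues alone become less informative and position information matters more for reconstruction.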
Anthology ID:
2022.emnlp-main.118
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1808–1815
URL:
https://aclanthology.org/2022.emnlp-main.118
DOI:
10.18653/v1/2022.emnlp-main.118
Cite (ACL):
Karim Lasri, Alessandro Lenci, and Thierry Poibeau. 2022. Word Order Matters When You Increase Masking. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1808–1815, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Word Order Matters When You Increase Masking (Lasri et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.118.pdf