Comparison of the effects of attention mechanism on translation tasks of different lengths of ambiguous words

Yue Hu, Jiahao Qin, Zemeiqi Chen, Jingshi Zhou, Xiaojun Zhang


Abstract
In recent years, attention mechanisms have been widely used in encoder-decoder-based neural machine translation (NMT). This paper examines how encoder-decoder attention performs on word sense disambiguation (WSD) across texts of different lengths, trying to determine how context markers influence the attention mechanism in the WSD task. We hypothesize that the attention mechanism performs similarly when translating texts of different lengths. We conclude that the alignment effect of the attention mechanism is magnified in short-text translation tasks containing ambiguous nouns, while in long-text tasks its effect falls far short of expectations; this suggests that attention is not the main mechanism by which the NMT model integrates context information for WSD. It may also mean that the attention mechanism pays more attention to the ambiguous nouns themselves than to the context markers around them. The experimental results show that as text length increases, the performance of NMT models using the attention mechanism gradually declines.
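For readers unfamiliar with the mechanism under study, the following sketch illustrates generic encoder-decoder (cross-) attention as used in NMT: a decoder state queries the encoder states, and the resulting weights are the source-side alignments whose behavior the paper compares across text lengths. This is a minimal dot-product (Luong-style) illustration, not the authors' implementation; the function names and dimensions are hypothetical.

    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def cross_attention(decoder_state, encoder_states):
        """Dot-product encoder-decoder attention (illustrative sketch).

        decoder_state:  (d,)    current decoder hidden state (the query)
        encoder_states: (T, d)  encoder hidden states, one per source token
        Returns the context vector (d,) and the alignment weights (T,).
        """
        scores = encoder_states @ decoder_state    # (T,) similarity of query to each source token
        weights = softmax(scores)                  # (T,) alignment distribution over the source
        context = weights @ encoder_states         # (d,) attention-weighted summary of the source
        return context, weights

    # Toy usage: a 5-token source sentence with 8-dimensional hidden states.
    rng = np.random.default_rng(0)
    enc = rng.normal(size=(5, 8))
    dec = rng.normal(size=(8,))
    context, weights = cross_attention(dec, enc)
    print(weights)  # which source tokens the decoder attends to

The weights printed at the end are the alignments in question: for an ambiguous source noun, the paper asks whether such weights actually concentrate on disambiguating context markers, or mostly on the noun itself.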
Anthology ID:
2020.iwdp-1.4
Volume:
Proceedings of the Second International Workshop of Discourse Processing
Month:
December
Year:
2020
Address:
Suzhou, China
Venues:
AACL | iwdp
Publisher:
Association for Computational Linguistics
Pages:
18–21
URL:
https://aclanthology.org/2020.iwdp-1.4
Cite (ACL):
Yue Hu, Jiahao Qin, Zemeiqi Chen, Jingshi Zhou, and Xiaojun Zhang. 2020. Comparison of the effects of attention mechanism on translation tasks of different lengths of ambiguous words. In Proceedings of the Second International Workshop of Discourse Processing, pages 18–21, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Comparison of the effects of attention mechanism on translation tasks of different lengths of ambiguous words (Hu et al., iwdp 2020)
PDF:
https://preview.aclanthology.org/update-css-js/2020.iwdp-1.4.pdf