Leveraging Local and Global Patterns for Self-Attention Networks

Mingzhou Xu, Derek F. Wong, Baosong Yang, Yue Zhang, Lidia S. Chao


Abstract
Self-attention networks have received increasing research attention. By default, the hidden state of each word is hierarchically calculated by attending to all words in the sentence, which assembles global information. However, several studies have pointed out that taking all signals into account may lead to overlooking neighboring information (e.g., phrase patterns). To address this issue, we propose a hybrid attention mechanism that dynamically leverages both local and global information. Specifically, our approach uses a gating scalar to integrate the two sources of information, which also makes it convenient to quantify their contributions. Experiments on various neural machine translation tasks demonstrate the effectiveness of the proposed method. Extensive analyses verify that the two types of context are complementary to each other, and that our method integrates them highly effectively.
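The abstract describes mixing a locally restricted attention context with the standard global attention context via a learned gating scalar. The sketch below is a minimal PyTorch illustration of that general idea, not the authors' implementation: the class name, the fixed-window definition of "local", and the way the gate is parameterized are all assumptions made for clarity.

```python
# Illustrative sketch of gated local/global self-attention.
# Names (LocalGlobalAttention, window_size, gate) are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalGlobalAttention(nn.Module):
    def __init__(self, d_model: int, window_size: int = 3):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.gate = nn.Linear(2 * d_model, 1)  # produces the gating scalar
        self.window_size = window_size
        self.scale = d_model ** -0.5

    def forward(self, x):                       # x: (batch, length, d_model)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale

        # Global context: attend to every position in the sentence.
        global_ctx = torch.matmul(F.softmax(scores, dim=-1), v)

        # Local context: mask out positions outside a fixed window.
        length = x.size(1)
        idx = torch.arange(length, device=x.device)
        local_mask = (idx[None, :] - idx[:, None]).abs() > self.window_size
        local_scores = scores.masked_fill(local_mask, float("-inf"))
        local_ctx = torch.matmul(F.softmax(local_scores, dim=-1), v)

        # Gating scalar in [0, 1] dynamically mixes the two contexts,
        # which also makes their relative contributions easy to inspect.
        g = torch.sigmoid(self.gate(torch.cat([local_ctx, global_ctx], dim=-1)))
        return g * local_ctx + (1.0 - g) * global_ctx
```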
Anthology ID:
P19-1295
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3069–3075
URL:
https://aclanthology.org/P19-1295
DOI:
10.18653/v1/P19-1295
Cite (ACL):
Mingzhou Xu, Derek F. Wong, Baosong Yang, Yue Zhang, and Lidia S. Chao. 2019. Leveraging Local and Global Patterns for Self-Attention Networks. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3069–3075, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Leveraging Local and Global Patterns for Self-Attention Networks (Xu et al., ACL 2019)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/P19-1295.pdf
Code:
scewiner/Leveraging