Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models

Ying Zhang, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, Manabu Okumura


Abstract
Encoder-decoder models have been commonly used for many tasks such as machine translation and response generation. As previous research has reported, these models suffer from generating redundant repetitions. In this research, we propose a new mechanism for encoder-decoder models that estimates the semantic difference of a source sentence before and after it is fed into the encoder-decoder model, in order to capture the consistency between the source and target sides. This mechanism helps reduce repeatedly generated tokens across a variety of tasks. Evaluation results on publicly available machine translation and response generation datasets demonstrate the effectiveness of our proposal.
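The abstract does not give the exact formulation of the proposed mechanism, so the following is only a minimal sketch of one way a source-target consistency term of this kind could be implemented: mean-pooled encoder and decoder states are projected into a shared space and their cosine-based difference is used as an auxiliary training loss. All class, parameter, and variable names here are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn

class ConsistencyRegularizer(nn.Module):
    """Hypothetical auxiliary loss penalizing the semantic gap between the
    source-side (encoder) and target-side (decoder) representations.
    Illustrative only; the paper's actual mechanism may differ."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Project both sides into a shared space before comparison.
        self.src_proj = nn.Linear(hidden_size, hidden_size)
        self.tgt_proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, enc_states, dec_states, src_mask, tgt_mask):
        # enc_states: (batch, src_len, hidden), dec_states: (batch, tgt_len, hidden)
        # src_mask / tgt_mask: (batch, len) float tensors with 1 for real tokens, 0 for padding.
        # Mean-pool over non-padding positions on each side.
        src_vec = (enc_states * src_mask.unsqueeze(-1)).sum(1) / src_mask.sum(1, keepdim=True)
        tgt_vec = (dec_states * tgt_mask.unsqueeze(-1)).sum(1) / tgt_mask.sum(1, keepdim=True)
        # Semantic difference as 1 - cosine similarity between the projected vectors.
        cos = nn.functional.cosine_similarity(
            self.src_proj(src_vec), self.tgt_proj(tgt_vec), dim=-1
        )
        return (1.0 - cos).mean()
```

In such a setup, this term would typically be added to the standard cross-entropy loss with a small weight, encouraging the decoder not to drift from the source semantics (e.g., by repeating tokens).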
Anthology ID:
2021.ranlp-1.180
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month:
September
Year:
2021
Address:
Held Online
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
1606–1615
URL:
https://aclanthology.org/2021.ranlp-1.180
Cite (ACL):
Ying Zhang, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, and Manabu Okumura. 2021. Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 1606–1615, Held Online. INCOMA Ltd.
Cite (Informal):
Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models (Zhang et al., RANLP 2021)
PDF:
https://preview.aclanthology.org/update-css-js/2021.ranlp-1.180.pdf
Data
PERSONA-CHAT