Multi-Task Neural Models for Translating Between Styles Within and Across Languages

Xing Niu, Sudha Rao, Marine Carpuat


Abstract
Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.
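The zero-shot claim in the abstract rests on sharing a single sequence-to-sequence model across tasks and steering its output style from the source side. One common realization in this line of work attaches a style tag to the input. The sketch below illustrates only the data-preparation idea; the tag tokens and the choice to leave translation pairs untagged are assumptions for illustration, not the paper's exact configuration (see the released code at xingniu/multitask-ft-fsmt for the authors' implementation).

# Minimal sketch of tag-based multi-task training data, assuming
# illustrative tag tokens (<2formal>, <2informal>) that are NOT
# necessarily the authors' exact vocabulary. Formality-transfer (FT)
# sources carry a style tag; translation (MT) pairs are mixed in
# untagged, so one shared encoder-decoder sees both tasks.

from typing import Iterable, Iterator, Tuple

def tag_style(pairs: Iterable[Tuple[str, str]],
              style: str) -> Iterator[Tuple[str, str]]:
    """Prefix each FT source sentence with the desired target style."""
    for src, tgt in pairs:
        yield f"<2{style}> {src}", tgt

# Toy stand-ins for GYAFC (English informal -> formal) and a
# French -> English bitext.
ft_pairs = [("gotta head out now", "I have to leave now.")]
mt_pairs = [("je dois partir maintenant", "I have to leave now.")]

train_stream = list(tag_style(ft_pairs, "formal")) + list(mt_pairs)

for src, tgt in train_stream:
    print(f"{src}\t{tgt}")

# At inference, prefixing a French source with a style tag, e.g.
# "<2formal> je dois partir maintenant", requests formality-sensitive
# translation even though no style-annotated translation pairs were
# observed during training.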
Anthology ID: C18-1086
Volume: Proceedings of the 27th International Conference on Computational Linguistics
Month: August
Year: 2018
Address: Santa Fe, New Mexico, USA
Editors: Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue: COLING
Publisher: Association for Computational Linguistics
Pages: 1008–1021
URL: https://aclanthology.org/C18-1086
Cite (ACL):
Xing Niu, Sudha Rao, and Marine Carpuat. 2018. Multi-Task Neural Models for Translating Between Styles Within and Across Languages. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1008–1021, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Multi-Task Neural Models for Translating Between Styles Within and Across Languages (Niu et al., COLING 2018)
PDF: https://preview.aclanthology.org/ingest-2024-clasp/C18-1086.pdf
Code: xingniu/multitask-ft-fsmt
Data: GYAFC