mLongT5: A Multilingual and Efficient Text-To-Text Transformer for Longer Sequences

David Uthus, Santiago Ontanon, Joshua Ainslie, Mandy Guo


Abstract
We present our work on developing a multilingual, efficient text-to-text transformer that is suitable for handling long inputs. This model, called mLongT5, builds upon the architecture of LongT5, while leveraging the multilingual datasets used for pretraining mT5 and the pretraining tasks of UL2. We evaluate this model on a variety of multilingual summarization and question-answering tasks, and the results show stronger performance for mLongT5 when compared to existing multilingual models such as mBART or M-BERT.
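Below is a minimal usage sketch, not taken from the paper or this page: it assumes the released mLongT5 checkpoints can be loaded through the Hugging Face transformers LongT5 classes (mLongT5 reuses LongT5's architecture with mT5's vocabulary), and the checkpoint id shown is an assumed example rather than an official release name.

    # Minimal sketch: load an assumed mLongT5 checkpoint via transformers.
    # The checkpoint id is hypothetical; mLongT5 shares LongT5's architecture,
    # so the LongT5 model class is used here under that assumption.
    from transformers import AutoTokenizer, LongT5ForConditionalGeneration

    model_name = "agemagician/mlong-t5-tglobal-base"  # assumed checkpoint id
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = LongT5ForConditionalGeneration.from_pretrained(model_name)

    # mLongT5 targets long inputs, so a much larger max_length than standard
    # T5 (e.g. 4096 tokens) is passed to the tokenizer.
    document = "..."  # a long document in any of the mT5 pretraining languages
    inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=4096)
    summary_ids = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))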
Anthology ID: 2023.findings-emnlp.628
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 9380–9386
URL: https://aclanthology.org/2023.findings-emnlp.628
DOI: 10.18653/v1/2023.findings-emnlp.628
Cite (ACL): David Uthus, Santiago Ontanon, Joshua Ainslie, and Mandy Guo. 2023. mLongT5: A Multilingual and Efficient Text-To-Text Transformer for Longer Sequences. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 9380–9386, Singapore. Association for Computational Linguistics.
Cite (Informal): mLongT5: A Multilingual and Efficient Text-To-Text Transformer for Longer Sequences (Uthus et al., Findings 2023)
PDF: https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.628.pdf