Efficient GPT Model Pre-training using Tensor Train Matrix Representation

Viktoriia Chekalina, Georgiy Novikov, Julia Gusak, Alexander Panchenko, Ivan Oseledets


Anthology ID: 2023.paclic-1.60
Volume: Proceedings of the 37th Pacific Asia Conference on Language, Information and Computation
Month: December
Year: 2023
Address: Hong Kong, China
Editors: Chu-Ren Huang, Yasunari Harada, Jong-Bok Kim, Si Chen, Yu-Yin Hsu, Emmanuele Chersoni, Pranav A, Winnie Huiheng Zeng, Bo Peng, Yuxi Li, Junlin Li
Venue: PACLIC
Publisher: Association for Computational Linguistics
Pages: 600–608
URL: https://aclanthology.org/2023.paclic-1.60
Cite (ACL): Viktoriia Chekalina, Georgiy Novikov, Julia Gusak, Alexander Panchenko, and Ivan Oseledets. 2023. Efficient GPT Model Pre-training using Tensor Train Matrix Representation. In Proceedings of the 37th Pacific Asia Conference on Language, Information and Computation, pages 600–608, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): Efficient GPT Model Pre-training using Tensor Train Matrix Representation (Chekalina et al., PACLIC 2023)
PDF: https://aclanthology.org/2023.paclic-1.60.pdf