2025
Evaluating the Long-Term Memory of Large Language Models
Zixi Jia | Qinghua Liu | Hexiao Li | Yuyan Chen | Jiqiang Liu
Findings of the Association for Computational Linguistics: ACL 2025
In applications such as dialogue systems, personalized recommendations, and personal assistants, large language models (LLMs) need to retain and utilize historical information over the long term to provide more accurate and consistent responses. Although long-term memory capability is crucial, recent studies have not thoroughly investigated the memory performance of large language models in long-term tasks. To address this gap, we introduce the Long-term Chronological Conversations (LOCCO) dataset and conduct a quantitative evaluation of the long-term memory capabilities of large language models. Experimental results demonstrate that large language models can retain past interaction information to a certain extent, but their memory decays over time. While rehearsal strategies can enhance memory persistence, excessive rehearsal is not an effective memory strategy for large models, unlike in smaller models. Additionally, the models exhibit memory preferences across different categories of information. Our study not only provides a new framework and dataset for evaluating the long-term memory capabilities of large language models but also offers important references for future enhancements of their memory persistence.
2019
The NiuTrans Machine Translation Systems for WMT19
Bei Li | Yinqiao Li | Chen Xu | Ye Lin | Jiqiang Liu | Hui Liu | Ziyang Wang | Yuhao Zhang | Nuo Xu | Zeyang Wang | Kai Feng | Hexuan Chen | Tengbo Liu | Yanyang Li | Qiang Wang | Tong Xiao | Jingbo Zhu
Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)
This paper describes the NiuTrans neural machine translation systems for the WMT 2019 news translation tasks. We participated in 13 translation directions, including 11 supervised tasks, namely EN↔{ZH, DE, RU, KK, LT} and GU→EN, and the unsupervised DE↔CS sub-track. Our systems were built on Deep Transformer and several back-translation methods. Iterative knowledge distillation and ensemble+reranking were also employed to obtain stronger models. Our unsupervised submissions were based on NMT enhanced by SMT. As a result, we achieved the highest BLEU scores in the {KK↔EN, GU→EN} directions, ranked 2nd in {RU→EN, DE↔CS}, and ranked 3rd in {ZH→EN, LT→EN, EN→RU, EN↔DE} among all constrained submissions.
2018
The NiuTrans Machine Translation System for WMT18
Qiang Wang | Bei Li | Jiqiang Liu | Bojian Jiang | Zheyang Zhang | Yinqiao Li | Ye Lin | Tong Xiao | Jingbo Zhu
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
This paper describes the submission of the NiuTrans neural machine translation system for the WMT 2018 Chinese ↔ English news translation tasks. Our baseline systems are based on the Transformer architecture. We further improve translation performance by 2.4-2.6 BLEU points through four aspects: architectural improvements, diverse ensemble decoding, reranking, and post-processing. Among constrained submissions, we rank 2nd out of 16 submitted systems on the Chinese → English task and 3rd out of 16 on the English → Chinese task.