Abstract
As scientific research proliferates, researchers face the daunting task of navigating and reading vast amounts of literature. Existing solutions, such as document QA, fail to provide personalized and up-to-date information efficiently. We present Arxiv Copilot, a self-evolving, efficient LLM system designed to assist researchers, built on thought retrieval, user profiling, and high-performance optimization. Specifically, Arxiv Copilot offers personalized research services while maintaining a database that is updated in real time. Quantitative evaluation demonstrates that Arxiv Copilot saves 69.92% of response time after efficient deployment. This paper details the design and implementation of Arxiv Copilot, highlighting its contributions to personalized academic support and its potential to streamline the research process. We have deployed Arxiv Copilot at https://huggingface.co/spaces/ulab-ai/ArxivCopilot.
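To make the thought-retrieval idea mentioned above concrete, the following is a minimal sketch, assuming a simple embedding-based memory in which past question-answer "thoughts" are stored and the most similar ones are retrieved to augment a new query. All names (`ThoughtMemory`, `embed`, `cosine`) are hypothetical and do not come from the Arxiv Copilot codebase.

```python
# Hypothetical sketch of a thought-retrieval memory (not the Arxiv Copilot implementation).
# Past question-answer "thoughts" are embedded; the most similar ones are retrieved
# so they can be prepended to the prompt of the next LLM call.
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Toy bag-of-words embedding; a real system would use a neural text encoder.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


class ThoughtMemory:
    """Stores (query, answer) pairs and retrieves the k most similar past thoughts."""

    def __init__(self):
        self.thoughts = []  # list of (embedding, query, answer)

    def add(self, query: str, answer: str) -> None:
        self.thoughts.append((embed(query), query, answer))

    def retrieve(self, query: str, k: int = 3):
        q = embed(query)
        ranked = sorted(self.thoughts, key=lambda t: cosine(q, t[0]), reverse=True)
        return [(past_query, answer) for _, past_query, answer in ranked[:k]]


if __name__ == "__main__":
    memory = ThoughtMemory()
    memory.add("What are LLM agents?",
               "LLM agents couple a language model with tools and memory.")
    memory.add("How does retrieval-augmented generation work?",
               "Relevant documents are retrieved and prepended to the prompt.")
    # The retrieved thoughts would be added to the context of the next query.
    print(memory.retrieve("Explain retrieval-augmented LLM systems", k=1))
```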
- Anthology ID: 2024.emnlp-demo.13
- Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
- Month: November
- Year: 2024
- Address: Miami, Florida, USA
- Editors: Delia Irazu Hernandez Farias, Tom Hope, Manling Li
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 122–130
- URL: https://aclanthology.org/2024.emnlp-demo.13
- DOI: 10.18653/v1/2024.emnlp-demo.13
- Cite (ACL): Guanyu Lin, Tao Feng, Pengrui Han, Ge Liu, and Jiaxuan You. 2024. Arxiv Copilot: A Self-Evolving and Efficient LLM System for Personalized Academic Assistance. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 122–130, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal): Arxiv Copilot: A Self-Evolving and Efficient LLM System for Personalized Academic Assistance (Lin et al., EMNLP 2024)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2024.emnlp-demo.13.pdf