LAVIS: A One-stop Library for Language-Vision Intelligence
Dongxu Li, Junnan Li, Hung Le, Guangsen Wang, Silvio Savarese, Steven C.H. Hoi
Abstract
We introduce LAVIS, an open-source deep learning library for LAnguage-VISion research and applications. LAVIS aims to serve as a one-stop comprehensive library that makes recent advances in the language-vision field accessible to researchers and practitioners, while also fostering future research and development. It features a unified interface for easy access to state-of-the-art image-language and video-language models and common datasets. LAVIS supports training, evaluation, and benchmarking on a rich variety of tasks, including multimodal classification, retrieval, captioning, visual question answering, dialogue, and pre-training. The library is also highly extensible and configurable, facilitating future development and customization. In this technical report, we describe the design principles, key components, and functionalities of the library, and present benchmarking results across common language-vision tasks.
- Anthology ID: 2023.acl-demo.3
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Danushka Bollegala, Ruihong Huang, Alan Ritter
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 31–41
- URL: https://aclanthology.org/2023.acl-demo.3
- DOI: 10.18653/v1/2023.acl-demo.3
- Cite (ACL): Dongxu Li, Junnan Li, Hung Le, Guangsen Wang, Silvio Savarese, and Steven C.H. Hoi. 2023. LAVIS: A One-stop Library for Language-Vision Intelligence. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations), pages 31–41, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): LAVIS: A One-stop Library for Language-Vision Intelligence (Li et al., ACL 2023)
- PDF: https://preview.aclanthology.org/landing_page/2023.acl-demo.3.pdf