Unveiling Environmental Impacts of Large Language Model Serving: A Functional Unit View

Yanran Wu, Inez Hua, Yi Ding

Abstract
Large language models (LLMs) offer powerful capabilities but come with significant environmental impacts, particularly carbon emissions. Existing studies benchmark carbon emissions but lack a standardized basis for comparison across different model configurations. To address this, we introduce the concept of the functional unit (FU) as a standardized basis of comparison and develop FUEL, the first FU-based framework for evaluating the environmental impact of LLM serving. Through three case studies, we uncover key insights and trade-offs in reducing carbon emissions by optimizing model size, quantization strategy, and hardware choice, paving the way for more sustainable LLM serving. The code is available at https://github.com/jojacola/FUEL.
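To make the abstract's core idea concrete, here is a minimal sketch of what normalizing carbon emissions by a functional unit looks like: total operational emissions are divided by a common unit of delivered service (here, grams of CO2e per 1,000 generated tokens) so that different model/hardware configurations become directly comparable. All names, fields, and numbers below are hypothetical illustrations for the general FU concept, not the FUEL framework's actual API or measurements.

```python
from dataclasses import dataclass


@dataclass
class ServingRun:
    """One measured LLM-serving configuration (hypothetical fields)."""
    name: str                 # configuration label (model size, quantization, GPU)
    energy_kwh: float         # energy consumed over the run, in kWh
    carbon_intensity: float   # grid carbon intensity, in gCO2e per kWh
    tokens_generated: int     # useful output produced over the run

    def carbon_per_fu(self, fu_tokens: int = 1000) -> float:
        """Operational carbon per functional unit (gCO2e per fu_tokens tokens)."""
        total_g_co2e = self.energy_kwh * self.carbon_intensity
        return total_g_co2e * fu_tokens / self.tokens_generated


# Hypothetical example: two configurations compared on the same FU basis,
# which is what a per-FU view enables and raw per-run totals do not.
runs = [
    ServingRun("13B-fp16-A100", energy_kwh=1.2, carbon_intensity=400.0,
               tokens_generated=500_000),
    ServingRun("13B-int4-A100", energy_kwh=0.7, carbon_intensity=400.0,
               tokens_generated=480_000),
]
for run in runs:
    print(f"{run.name}: {run.carbon_per_fu():.2f} gCO2e / 1k tokens")
```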
Anthology ID: 2025.acl-long.519
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 10560–10576
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.519/
Cite (ACL): Yanran Wu, Inez Hua, and Yi Ding. 2025. Unveiling Environmental Impacts of Large Language Model Serving: A Functional Unit View. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10560–10576, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Unveiling Environmental Impacts of Large Language Model Serving: A Functional Unit View (Wu et al., ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.519.pdf