Textual Aesthetics in Large Language Models

Lingjie Jiang, Shaohan Huang, Xun Wu, Furu Wei


Abstract
Image aesthetics is a crucial metric in the field of image generation. However, textual aesthetics has not been sufficiently explored. With the widespread application of large language models (LLMs), previous work has primarily focused on the correctness of content and the helpfulness of responses. Nonetheless, aesthetically presented responses are also important for LLMs, as they offer a cleaner layout and greater consistency and coherence in content. In this work, we introduce an aesthetics-polishing pipeline and use it to construct a textual aesthetics dataset named TEXAES. We propose a textual aesthetics-powered fine-tuning method based on direct preference optimization, termed TAPO, which leverages textual aesthetics without compromising content correctness. Additionally, we develop two evaluation methods for textual aesthetics, based on text and image analysis respectively. Our experiments demonstrate that using textual aesthetics data and employing the TAPO fine-tuning method not only improves aesthetic scores but also enhances performance on general evaluation datasets such as AlpacaEval and Arena-Hard.
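TAPO builds on direct preference optimization (DPO). As background only, the sketch below shows a standard DPO objective in PyTorch; it illustrates the underlying preference loss, not the paper's TAPO variant, and the function and argument names are hypothetical.

    import torch.nn.functional as F

    def dpo_loss(policy_chosen_logps, policy_rejected_logps,
                 ref_chosen_logps, ref_rejected_logps, beta=0.1):
        # Log-ratios of the trained policy vs. the frozen reference model
        # for the preferred (chosen) and dispreferred (rejected) responses.
        chosen_logratio = policy_chosen_logps - ref_chosen_logps
        rejected_logratio = policy_rejected_logps - ref_rejected_logps
        # Standard DPO objective: increase the margin between chosen and
        # rejected responses, scaled by the temperature beta.
        logits = beta * (chosen_logratio - rejected_logratio)
        return -F.logsigmoid(logits).mean()

In this sketch the inputs are summed per-response token log-probabilities; a preference-tuning method such as TAPO would supply preference pairs (here, pairs reflecting textual aesthetics) through the same kind of objective.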
Anthology ID: 2025.emnlp-main.696
Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2025
Address: Suzhou, China
Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 13801–13829
URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.696/
Cite (ACL): Lingjie Jiang, Shaohan Huang, Xun Wu, and Furu Wei. 2025. Textual Aesthetics in Large Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 13801–13829, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): Textual Aesthetics in Large Language Models (Jiang et al., EMNLP 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.696.pdf
Checklist: 2025.emnlp-main.696.checklist.pdf