@inproceedings{dey-etal-2025-mathtt,
    title = "{GeLLM{\textthreesuperior}O}: Generalizing Large Language Models for Multi-property Molecule Optimization",
    author = "Dey, Vishal  and
      Hu, Xiao  and
      Ning, Xia",
    editor = "Che, Wanxiang  and
      Nabende, Joyce  and
      Shutova, Ekaterina  and
      Pilehvar, Mohammad Taher",
    booktitle = "Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.acl-long.1225/",
    doi = "10.18653/v1/2025.acl-long.1225",
    pages = "25192--25221",
    ISBN = "979-8-89176-251-0",
    abstract = "Despite recent advancements, most computational methods for molecule optimization are constrained to single- or double-property optimization tasks and suffer from poor scalability and generalizability to novel optimization tasks. Meanwhile, Large Language Models (LLMs) demonstrate remarkable out-of-domain generalizability to novel tasks. To demonstrate LLMs' potential for molecule optimization, we introduce $\mathtt{MuMOInstruct}$, the first high-quality instruction-tuning dataset specifically focused on multi-property molecule optimization tasks. Leveraging $\mathtt{MuMOInstruct}$, we develop $\mathtt{GeLLM^3O}$s, a series of instruction-tuned LLMs for molecule optimization. Extensive evaluations across 5 in-domain and 5 out-of-domain tasks demonstrate that $\mathtt{GeLLM^3O}$s consistently outperform state-of-the-art baselines. $\mathtt{GeLLM^3O}$s also exhibit outstanding zero-shot generalization to unseen tasks, significantly outperforming powerful closed-source LLMs. Such strong generalizability demonstrates the tremendous potential of $\mathtt{GeLLM^3O}$s as foundational models for molecule optimization, thereby tackling novel optimization tasks without resource-intensive retraining. $\mathtt{MuMOInstruct}$ and code are accessible through https://github.com/ninglab/GeLLMO."
}

Markdown (Informal)
[GeLLM³O: Generalizing Large Language Models for Multi-property Molecule Optimization](https://aclanthology.org/2025.acl-long.1225/) (Dey et al., ACL 2025)