@inproceedings{park-etal-2025-bridging,
    title = "Bridging the Gap Between Molecule and Textual Descriptions via Substructure-aware Alignment",
    author = "Park, Hyuntae  and
      Kim, Yeachan  and
      Lee, SangKeun",
    editor = "Christodoulopoulos, Christos  and
      Chakraborty, Tanmoy  and
      Rose, Carolyn  and
      Peng, Violet",
    booktitle = "Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1197/",
    pages = "23470--23490",
    ISBN = "979-8-89176-332-6",
    abstract = "Molecule and text representation learning has gained increasing interest due to its potential for enhancing the understanding of chemical information. However, existing models often struggle to capture subtle differences between molecules and their descriptions, as they lack the ability to learn fine-grained alignments between molecular substructures and chemical phrases. To address this limitation, we introduce MolBridge, a novel molecule{--}text learning framework based on substructure-aware alignments. Specifically, we augment the original molecule{--}description pairs with additional alignment signals derived from molecular substructures and chemical phrases. To effectively learn from these enriched alignments, MolBridge employs substructure-aware contrastive learning, coupled with a self-refinement mechanism that filters out noisy alignment signals. Experimental results show that MolBridge effectively captures fine-grained correspondences and outperforms state-of-the-art baselines on a wide range of molecular benchmarks, underscoring the importance of substructure-aware alignment in molecule{--}text learning."
}