@inproceedings{xie-chi-2024-chemical,
    title = "Could Chemical Language Models benefit from Message Passing",
    author = "Xie, Jiaqing  and
      Chi, Ziheng",
    editor = "Edwards, Carl  and
      Wang, Qingyun  and
      Li, Manling  and
      Zhao, Lawrence  and
      Hope, Tom  and
      Ji, Heng",
    booktitle = "Proceedings of the 1st Workshop on Language + Molecules (L+M 2024)",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.langmol-1.2/",
    doi = "10.18653/v1/2024.langmol-1.2",
    pages = "10--20",
    abstract = "Pretrained language models (LMs) showcase significant capabilities in processing molecular text, while concurrently, message passing neural networks (MPNNs) demonstrate resilience and versatility in the domain of molecular science. Despite these advancements, we find there are limited studies investigating the bidirectional interactions between molecular structures and their corresponding textual representations. Therefore, in this paper, we propose two strategies to evaluate whether integrating the two sources of information can enhance performance: contrastive learning, in which an MPNN supervises the training of the LM, and fusion, which exploits information from both models. Our empirical analysis reveals that the integration approaches exhibit superior performance compared to baselines when applied to smaller molecular graphs, while these integration approaches do not yield performance enhancements on large-scale graphs."
}