Contrastive Learning Using Graph Embeddings for Domain Adaptation of Language Models in the Process Industry

Anastasia Zhukova, Jonas Luehrs, Christian Matt, Bela Gipp


Abstract
Recent trends in NLP leverage knowledge graphs (KGs) to enhance pretrained language models, incorporating additional knowledge from the graph structure to capture domain-specific terminology or relationships between documents that might otherwise be overlooked. This paper explores how SciNCL, a graph-aware neighborhood contrastive learning methodology originally designed for scientific publications, can be applied to the process industry domain, where text logs contain crucial information about daily operations and are often structured as sparse KGs. Our experiments demonstrate that language models fine-tuned with triplets derived from graph embeddings (GE) outperform a state-of-the-art mE5-large text encoder by 9.8–14.3% (5.45–7.96 points) on the proprietary Process Industry Text Embedding Benchmark (PITEB) while having three times fewer parameters.
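
To make the approach the abstract names more concrete, below is a rough, hypothetical Python sketch of SciNCL-style neighborhood contrastive fine-tuning: triplets are mined from graph-embedding neighborhoods and a text encoder is fine-tuned with a triplet loss via sentence-transformers. The corpus, the randomly generated graph embeddings, the neighborhood band offsets, and the base model are all illustrative placeholders, not details taken from the paper.

import numpy as np
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Placeholder corpus and graph embeddings; in the paper's setting these would be
# process-industry text logs and embeddings of the corresponding sparse KG nodes.
rng = np.random.default_rng(0)
docs = [f"maintenance log entry {i}" for i in range(200)]
graph_emb = rng.normal(size=(200, 64))

# Rank all documents by distance to each anchor in graph-embedding space.
dists = np.linalg.norm(graph_emb[:, None, :] - graph_emb[None, :, :], axis=-1)
order = np.argsort(dists, axis=1)  # order[i] lists documents nearest-first

# SciNCL samples positives from a near neighborhood band and hard negatives from
# a band just beyond it; the offsets below are illustrative hyperparameters.
train_examples = []
for i in range(len(docs)):
    positives = order[i][1:4]    # skip rank 0 (the anchor itself)
    negatives = order[i][20:23]  # slightly farther neighbors -> hard negatives
    for p, n in zip(positives, negatives):
        train_examples.append(InputExample(texts=[docs[i], docs[p], docs[n]]))

# Fine-tune a small text encoder on the mined triplets (base model is arbitrary).
model = SentenceTransformer("intfloat/multilingual-e5-small")
loader = DataLoader(train_examples, shuffle=True, batch_size=16)
loss = losses.TripletLoss(model=model)
model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=100)
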
Anthology ID:
2025.emnlp-industry.103
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1472–1484
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.103/
Cite (ACL):
Anastasia Zhukova, Jonas Luehrs, Christian Matt, and Bela Gipp. 2025. Contrastive Learning Using Graph Embeddings for Domain Adaptation of Language Models in the Process Industry. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 1472–1484, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Contrastive Learning Using Graph Embeddings for Domain Adaptation of Language Models in the Process Industry (Zhukova et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.103.pdf