Evaluating Biomedical Word Embeddings for Vocabulary Alignment at Scale in the UMLS Metathesaurus Using Siamese Networks

Goonmeet Bajaj, Vinh Nguyen, Thilini Wijesiriwardene, Hong Yung Yip, Vishesh Javangula, Amit Sheth, Srinivasan Parthasarathy, Olivier Bodenreider


Abstract
Recent work uses a Siamese Network, initialized with BioWordVec embeddings (distributed word embeddings), for predicting synonymy among biomedical terms to automate a part of the UMLS (Unified Medical Language System) Metathesaurus construction process. We evaluate the use of contextualized word embeddings extracted from nine different biomedical BERT-based models for synonym prediction in the UMLS by replacing BioWordVec embeddings with embeddings extracted from each biomedical BERT model using different feature extraction methods. Finally, we conduct a thorough grid search, which prior work lacks, to find the best set of hyperparameters. Surprisingly, we find that Siamese Networks initialized with BioWordVec embeddings still outperform the Siamese Networks initialized with embeddings extracted from the biomedical BERT models.
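The core idea of the abstract can be sketched in miniature: a Siamese setup passes both terms through the same encoder and scores their similarity to predict synonymy. The sketch below is illustrative only, assuming toy 3-dimensional vectors in place of the real 200-dimensional BioWordVec (or BERT-derived) embeddings, mean pooling in place of the paper's learned encoder, and a hypothetical similarity threshold:

```python
import math

# Hypothetical toy word vectors standing in for BioWordVec embeddings
# (the real embeddings are far higher-dimensional; these are illustrative).
TOY_EMBEDDINGS = {
    "myocardial": [0.9, 0.1, 0.2],
    "infarction": [0.8, 0.3, 0.1],
    "heart":      [0.85, 0.15, 0.25],
    "attack":     [0.75, 0.35, 0.15],
    "fracture":   [0.1, 0.9, 0.4],
}

def encode(term):
    """Shared 'tower': look up each token and mean-pool.

    A stand-in for the trained encoder in the actual Siamese Network;
    weight sharing between the two inputs is what makes it Siamese.
    """
    vecs = [TOY_EMBEDDINGS[tok] for tok in term.lower().split()]
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def predict_synonymy(term_a, term_b, threshold=0.95):
    """Encode both terms with the SAME encoder and threshold the
    similarity to get a binary synonymy prediction (threshold is an
    assumed value, one of the hyperparameters a grid search would tune)."""
    return cosine(encode(term_a), encode(term_b)) >= threshold
```

For example, `predict_synonymy("myocardial infarction", "heart attack")` returns `True` under these toy vectors, while `predict_synonymy("myocardial infarction", "fracture")` returns `False`. Swapping the embedding table for vectors extracted from a biomedical BERT model is, at this level of abstraction, the substitution the paper evaluates.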
Anthology ID:
2022.insights-1.11
Volume:
Proceedings of the Third Workshop on Insights from Negative Results in NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
insights
Publisher:
Association for Computational Linguistics
Pages:
82–87
URL:
https://aclanthology.org/2022.insights-1.11
DOI:
10.18653/v1/2022.insights-1.11
Cite (ACL):
Goonmeet Bajaj, Vinh Nguyen, Thilini Wijesiriwardene, Hong Yung Yip, Vishesh Javangula, Amit Sheth, Srinivasan Parthasarathy, and Olivier Bodenreider. 2022. Evaluating Biomedical Word Embeddings for Vocabulary Alignment at Scale in the UMLS Metathesaurus Using Siamese Networks. In Proceedings of the Third Workshop on Insights from Negative Results in NLP, pages 82–87, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Evaluating Biomedical Word Embeddings for Vocabulary Alignment at Scale in the UMLS Metathesaurus Using Siamese Networks (Bajaj et al., insights 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.insights-1.11.pdf
Video:
https://preview.aclanthology.org/auto-file-uploads/2022.insights-1.11.mp4