Enhancing Antimicrobial Drug Resistance Classification by Integrating Sequence-Based and Text-Based Representations

Hyunwoo Yoo, Bahrad Sokhansanj, James Brown


Abstract
Antibiotic resistance identification is essential for public health, medical treatment, and drug development. Traditional sequence-based models struggle with accurate resistance prediction due to the lack of biological context. To address this, we propose an NLP-based model that integrates genetic sequences with structured textual annotations, including gene family classifications and resistance mechanisms. Our approach leverages pretrained language models for both genetic sequences and biomedical text, aligning biological metadata with sequence-based embeddings. We construct a novel dataset based on the Antibiotic Resistance Ontology (ARO), consolidating gene sequences with resistance-related textual information. Experiments show that incorporating domain knowledge significantly improves classification accuracy over sequence-only models, reducing reliance on exhaustive laboratory testing. By integrating genetic sequence processing with biomedical text understanding, our approach provides a scalable and interpretable solution for antibiotic resistance prediction.
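The abstract describes a late-fusion setup: a pretrained genetic-sequence language model and a pretrained biomedical text model each produce an embedding, and the two are combined for resistance classification. The paper's exact architecture is not given on this page; the sketch below is a minimal, hypothetical illustration of concatenation-based fusion with a linear softmax head, using random stand-in vectors in place of real model outputs (all dimensions and names are assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a nucleotide LM and a biomedical text LM would
# each emit a fixed-size vector per input; random vectors stand in here.
SEQ_DIM, TXT_DIM, N_CLASSES = 768, 768, 6  # 6 = e.g. drug-class labels

def fuse(seq_emb, txt_emb):
    """Late fusion by simple concatenation of the two modality embeddings."""
    return np.concatenate([seq_emb, txt_emb])

def classify(fused, W, b):
    """Linear head followed by a softmax over resistance classes."""
    logits = fused @ W + b
    exp = np.exp(logits - logits.max())  # subtract max for stability
    return exp / exp.sum()

seq_emb = rng.standard_normal(SEQ_DIM)  # stand-in for sequence-model output
txt_emb = rng.standard_normal(TXT_DIM)  # stand-in for text-model output
                                        # (e.g. an ARO annotation embedding)
W = rng.standard_normal((SEQ_DIM + TXT_DIM, N_CLASSES)) * 0.01
b = np.zeros(N_CLASSES)

probs = classify(fuse(seq_emb, txt_emb), W, b)
```

In practice the fusion step and classifier would be trained end to end; concatenation is only one of several possible alignment strategies.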
Anthology ID:
2025.bionlp-1.23
Volume:
ACL 2025
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Dina Demner-Fushman, Sophia Ananiadou, Makoto Miwa, Junichi Tsujii
Venues:
BioNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
263–273
URL:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bionlp-1.23/
Cite (ACL):
Hyunwoo Yoo, Bahrad Sokhansanj, and James Brown. 2025. Enhancing Antimicrobial Drug Resistance Classification by Integrating Sequence-Based and Text-Based Representations. In ACL 2025, pages 263–273, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Enhancing Antimicrobial Drug Resistance Classification by Integrating Sequence-Based and Text-Based Representations (Yoo et al., BioNLP 2025)
PDF:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bionlp-1.23.pdf
Supplementary material:
2025.bionlp-1.23.SupplementaryMaterial.txt