Abstract
In the medical domain, given a medical question, it is difficult to manually select the most relevant information from a large number of search results. The BioNLP 2019 workshop proposes a Question Answering (QA) task, which encourages the use of text mining technology to automatically judge whether a search result is an answer to a medical question. The main challenge of the QA task is mining the semantic relation between a question and a candidate answer. We propose a BioBERT Transformer model to tackle this challenge, which applies Transformer layers to extract semantic relations between words in questions and answers. Furthermore, BioBERT is utilized to encode medical domain-specific contextualized word representations. Our method reaches an accuracy of 76.24% and a Spearman correlation of 17.12% on the BioNLP 2019 QA task.
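The paper does not include code here; as a rough illustration of the general architecture the abstract describes (BioBERT contextualized encoding of a question-answer pair, a Transformer layer over the encoded sequence, and a binary relevance decision), the sketch below uses PyTorch and the Hugging Face transformers library. The checkpoint name, layer sizes, and classifier head are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): encode a (question, answer) pair with
# BioBERT and classify whether the answer is relevant to the question.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class PairClassifier(nn.Module):
    def __init__(self, encoder_name="dmis-lab/biobert-base-cased-v1.1"):
        super().__init__()
        # BioBERT provides domain-specific contextualized word representations.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # The paper stacks Transformer layers on top of BioBERT; a single
        # nn.TransformerEncoderLayer stands in for that component here.
        self.transformer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=8, batch_first=True)
        self.classifier = nn.Linear(hidden, 2)  # answer / not an answer

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Mask padded positions (True = ignore) when re-encoding the sequence.
        hidden_states = self.transformer(
            out.last_hidden_state,
            src_key_padding_mask=~attention_mask.bool())
        # Classify from the representation at the [CLS] position.
        return self.classifier(hidden_states[:, 0])

tokenizer = AutoTokenizer.from_pretrained("dmis-lab/biobert-base-cased-v1.1")
model = PairClassifier()
batch = tokenizer("What causes anemia?",
                  "Iron deficiency is a common cause of anemia.",
                  return_tensors="pt", truncation=True, padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
```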
- Anthology ID: W19-5047
- Volume: Proceedings of the 18th BioNLP Workshop and Shared Task
- Month: August
- Year: 2019
- Address: Florence, Italy
- Editors: Dina Demner-Fushman, Kevin Bretonnel Cohen, Sophia Ananiadou, Junichi Tsujii
- Venue: BioNLP
- SIG: SIGBIOMED
- Publisher: Association for Computational Linguistics
- Pages: 446–452
- URL: https://aclanthology.org/W19-5047
- DOI: 10.18653/v1/W19-5047
- Cite (ACL): Huiwei Zhou, Bizun Lei, Zhe Liu, and Zhuang Liu. 2019. DUT-BIM at MEDIQA 2019: Utilizing Transformer Network and Medical Domain-Specific Contextualized Representations for Question Answering. In Proceedings of the 18th BioNLP Workshop and Shared Task, pages 446–452, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): DUT-BIM at MEDIQA 2019: Utilizing Transformer Network and Medical Domain-Specific Contextualized Representations for Question Answering (Zhou et al., BioNLP 2019)
- PDF: https://preview.aclanthology.org/nschneid-patch-3/W19-5047.pdf