Abstract
“Pre-trained language models (PLMs) have been widely used in entity and relation extraction methods in recent years. However, due to the semantic gap between the general-domain text used for pre-training and domain-specific text, these methods encounter semantic redundancy and insufficient domain semantics when it comes to domain-specific tasks. To mitigate this issue, we propose a low-cost and effective knowledge-enhanced method to facilitate domain-specific semantics modeling in joint entity and relation extraction. Precisely, we use ontology and entity type descriptions as domain knowledge sources, which are encoded and incorporated into the downstream entity and relation extraction model to improve its understanding of domain-specific information. We construct a dataset called SSUIE-RE for Chinese entity and relation extraction in the space science and utilization domain of China Manned Space Engineering, which contains a wealth of domain-specific knowledge. The experimental results on SSUIE-RE demonstrate the effectiveness of our method, achieving a 1.4% absolute improvement in relation F1 score over the previous best approach.”
- Anthology ID:
- 2023.ccl-1.61
- Volume:
- Proceedings of the 22nd Chinese National Conference on Computational Linguistics
- Month:
- August
- Year:
- 2023
- Address:
- Harbin, China
- Editors:
- Maosong Sun, Bing Qin, Xipeng Qiu, Jing Jiang, Xianpei Han
- Venue:
- CCL
- Publisher:
- Chinese Information Processing Society of China
- Pages:
- 713–725
- Language:
- English
- URL:
- https://aclanthology.org/2023.ccl-1.61
- Cite (ACL):
- Xiong Xiong, Wang Chen, Liu Yunfei, and Li Shengyang. 2023. Enhancing Ontology Knowledge for Domain-Specific Joint Entity and Relation Extraction. In Proceedings of the 22nd Chinese National Conference on Computational Linguistics, pages 713–725, Harbin, China. Chinese Information Processing Society of China.
- Cite (Informal):
- Enhancing Ontology Knowledge for Domain-Specific Joint Entity and Relation Extraction (Xiong et al., CCL 2023)
- PDF:
- https://preview.aclanthology.org/proper-vol2-ingestion/2023.ccl-1.61.pdf