Abstract
Named entity recognition and relation extraction are two important fundamental problems. Joint learning algorithms have been proposed to solve both tasks simultaneously, and many of them cast the joint task as a table-filling problem. However, they typically focus on learning a single encoder (usually learning representations in the form of a table) to capture the information required for both tasks within the same space. We argue that it can be beneficial to design two distinct encoders to capture these two different types of information during learning. In this work, we propose novel table-sequence encoders, in which two different encoders (a table encoder and a sequence encoder) are designed to help each other in the representation learning process. Our experiments confirm the advantages of having two encoders over one. On several standard datasets, our model shows significant improvements over existing approaches.
- Anthology ID: 2020.emnlp-main.133
- Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month: November
- Year: 2020
- Address: Online
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 1706–1721
- URL: https://aclanthology.org/2020.emnlp-main.133
- DOI: 10.18653/v1/2020.emnlp-main.133
- Cite (ACL): Jue Wang and Wei Lu. 2020. Two are Better than One: Joint Entity and Relation Extraction with Table-Sequence Encoders. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1706–1721, Online. Association for Computational Linguistics.
- Cite (Informal): Two are Better than One: Joint Entity and Relation Extraction with Table-Sequence Encoders (Wang & Lu, EMNLP 2020)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2020.emnlp-main.133.pdf
- Code: LorrinWWW/two-are-better-than-one + additional community code
- Data: ACE 2004, ACE 2005, Adverse Drug Events (ADE) Corpus, FewRel, Wiki-ZSL
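To make the abstract's central idea concrete, here is a minimal sketch of a table-sequence interaction: a sequence encoder keeps one vector per token, a table encoder keeps one vector per token pair, and each stacked layer lets the two refresh each other. All shapes, the layer count, and the simple averaging/pooling interactions are illustrative assumptions for this sketch, not the paper's actual architecture (which the PDF above describes in full).

```python
import numpy as np

# Illustrative sketch only: shapes, layer count, and the mean-based
# interactions are assumptions, not the paper's exact design.

rng = np.random.default_rng(0)
n, d = 5, 8                      # sentence length, hidden size

seq = rng.normal(size=(n, d))    # sequence representation: one row per token
table = np.zeros((n, n, d))      # table representation: one cell per token pair

def layer(seq, table):
    # Table encoder: cell (i, j) is refreshed from the current sequence
    # states of tokens i and j (here simply their average).
    new_table = table + 0.5 * (seq[:, None, :] + seq[None, :, :])
    # Sequence encoder: token i is refreshed by pooling over row i of the
    # table (a toy stand-in for table-guided attention).
    new_seq = seq + new_table.mean(axis=1)
    return new_seq, new_table

for _ in range(3):               # a few stacked layers
    seq, table = layer(seq, table)

print(seq.shape, table.shape)    # → (5, 8) (5, 5, 8)
```

The point of the sketch is the mutual refresh: the table is rebuilt from the current sequence states at every layer, and the sequence is then updated from the new table, so neither representation is learned in isolation.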