Flat Multi-modal Interaction Transformer for Named Entity Recognition

Junyu Lu, Dixiang Zhang, Jiaxing Zhang, Pingjian Zhang


Abstract
Multi-modal named entity recognition (MNER) aims to identify entity spans and recognize their categories in social media posts with the aid of images. However, dominant MNER approaches usually carry out cross-modal interaction by alternating self-attention and cross-attention or by over-relying on a gating mechanism, which results in imprecise and biased correspondence between the fine-grained semantic units of text and image. To address this issue, we propose a Flat Multi-modal Interaction Transformer (FMIT) for MNER. Specifically, we first utilize noun phrases in sentences and general domain words to obtain visual cues. Then, we transform the fine-grained semantic representations of the vision and text into a unified lattice structure and design a novel relative position encoding to match the different modalities in the Transformer. Meanwhile, we leverage entity boundary detection as an auxiliary task to alleviate visual bias. Experiments show that our method achieves new state-of-the-art performance on two benchmark datasets.
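The abstract describes placing text tokens and visual units in one flat lattice and matching them with a span-based relative position encoding. Below is a minimal sketch of how such a flat-lattice relative position encoding can be computed (in the style of FLAT-like encodings, where every lattice unit carries head/tail span positions and the four pairwise distances are embedded and fused). All names here, such as FlatRelativePositionEncoding, are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


def sinusoidal_embedding(rel_pos: torch.Tensor, d_model: int) -> torch.Tensor:
    """Map signed integer relative distances (L, L) to sinusoidal
    embeddings (L, L, d_model)."""
    inv_freq = 1.0 / (10000 ** (torch.arange(0, d_model, 2).float() / d_model))
    angles = rel_pos.unsqueeze(-1).float() * inv_freq  # (L, L, d_model/2)
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)


class FlatRelativePositionEncoding(nn.Module):
    """Fuse the four head/tail span distances between lattice units into a
    single relative position representation, as in flat-lattice Transformers.
    (Illustrative sketch; not the paper's implementation.)"""

    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(4 * d_model, d_model)

    def forward(self, head: torch.Tensor, tail: torch.Tensor) -> torch.Tensor:
        # head, tail: (L,) span boundaries for every lattice unit. A text
        # token has head == tail; a visual unit spans the phrase (e.g. a
        # noun phrase) it was aligned to.
        d_hh = head.unsqueeze(1) - head.unsqueeze(0)  # head-to-head
        d_ht = head.unsqueeze(1) - tail.unsqueeze(0)  # head-to-tail
        d_th = tail.unsqueeze(1) - head.unsqueeze(0)  # tail-to-head
        d_tt = tail.unsqueeze(1) - tail.unsqueeze(0)  # tail-to-tail
        d = self.proj.in_features // 4
        feats = torch.cat(
            [sinusoidal_embedding(x, d) for x in (d_hh, d_ht, d_th, d_tt)],
            dim=-1,
        )
        return torch.relu(self.proj(feats))  # (L, L, d_model)


# Example: 3 text tokens followed by 1 visual unit aligned to tokens 0..1.
head = torch.tensor([0, 1, 2, 0])
tail = torch.tensor([0, 1, 2, 1])
rpe = FlatRelativePositionEncoding(d_model=64)
print(rpe(head, tail).shape)  # torch.Size([4, 4, 64])
```

The resulting (L, L, d_model) tensor would then bias the attention scores of a Transformer layer so that text tokens and visual units interact according to their lattice positions; how exactly it enters the attention computation is specified in the paper itself.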
Anthology ID:
2022.coling-1.179
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2055–2064
URL:
https://aclanthology.org/2022.coling-1.179
Cite (ACL):
Junyu Lu, Dixiang Zhang, Jiaxing Zhang, and Pingjian Zhang. 2022. Flat Multi-modal Interaction Transformer for Named Entity Recognition. In Proceedings of the 29th International Conference on Computational Linguistics, pages 2055–2064, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Flat Multi-modal Interaction Transformer for Named Entity Recognition (Lu et al., COLING 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.coling-1.179.pdf