2024
The Integration of Semantic and Structural Knowledge in Knowledge Graph Entity Typing
Muzhi Li | Minda Hu | Irwin King | Ho-fung Leung
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
The Knowledge Graph Entity Typing (KGET) task aims to predict missing type annotations for entities in knowledge graphs. Recent works utilize only the structural knowledge in the local neighborhood of entities, disregarding the semantic knowledge in the textual representations of entities, relations, and types, which is also crucial for type inference. Additionally, we observe that the interaction between semantic and structural knowledge can be utilized to address the false-negative problem. In this paper, we propose a novel Semantic and Structure-aware KG Entity Typing (SSET) framework, which is composed of three modules. First, the Semantic Knowledge Encoding module encodes factual knowledge in the KG with a Masked Entity Typing task. Then, the Structural Knowledge Aggregation module aggregates knowledge from the multi-hop neighborhood of entities to infer missing types. Finally, the Unsupervised Type Re-ranking module utilizes the inference results from the two models above to generate type predictions that are robust to false-negative samples. Extensive experiments show that SSET significantly outperforms existing state-of-the-art methods.
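As a rough illustration of the re-ranking idea, the Python sketch below combines per-type scores from a semantic model and a structural model into a single ranking, so that a type favored by both models outranks one favored by only one; this is how agreement between the two can mitigate false negatives. All names (rerank_types, semantic_scores, alpha) and the linear-combination rule are illustrative assumptions, not the paper's actual method.

# A minimal sketch, assuming each model yields a score per candidate type.
def rerank_types(semantic_scores: dict, structural_scores: dict, alpha: float = 0.5) -> list:
    """Combine per-type scores from two models; return types ranked high-to-low.

    A candidate ranked highly by both models is less likely to be a
    false negative than one favored by a single model alone.
    """
    candidates = set(semantic_scores) | set(structural_scores)
    combined = {
        t: alpha * semantic_scores.get(t, 0.0) + (1 - alpha) * structural_scores.get(t, 0.0)
        for t in candidates
    }
    return sorted(combined, key=combined.get, reverse=True)

# Example: both models agree that "scientist" fits the entity best.
print(rerank_types({"scientist": 0.9, "artist": 0.2},
                   {"scientist": 0.7, "politician": 0.4}))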
Abstract-level Deductive Reasoning for Pre-trained Language Models
Xin Wu | Yi Cai | Ho-fung Leung
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Pre-trained Language Models (PLMs) have been shown to be able to emulate deductive reasoning in natural language. However, PLMs are easily affected by irrelevant information (e.g., entities) in instance-level proofs when learning deductive reasoning. To address this limitation, we propose an Abstract-level Deductive Reasoner (ADR). ADR is trained to predict the abstract reasoning proof of each sample, which guides PLMs to learn general reasoning patterns rather than instance-level knowledge. Experimental results demonstrate that ADR reduces the impact of instance-level knowledge on PLMs by over 70%.
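To make the abstraction step concrete, the sketch below replaces concrete entity mentions in a proof statement with generic placeholders, the kind of transformation that removes instance-level cues so a model must rely on the reasoning pattern. The placeholder scheme (X1, X2, ...) and the function name are illustrative assumptions, not the paper's exact procedure.

# A minimal sketch, assuming entity mentions are known in advance.
def abstract_proof(text: str, entities: list) -> str:
    """Replace each entity mention with a positional placeholder."""
    for i, entity in enumerate(entities, start=1):
        text = text.replace(entity, f"X{i}")
    return text

# "If Anne is kind then Bob is big." -> "If X1 is kind then X2 is big."
print(abstract_proof("If Anne is kind then Bob is big.", ["Anne", "Bob"]))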
2021
IgSEG: Image-guided Story Ending Generation
Qingbao Huang | Chuan Huang | Linzhang Mo | Jielong Wei | Yi Cai | Ho-fung Leung | Qing Li
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
2020
Aligned Dual Channel Graph Convolutional Network for Visual Question Answering
Qingbao Huang | Jielong Wei | Yi Cai | Changmeng Zheng | Junying Chen | Ho-fung Leung | Qing Li
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Visual question answering aims to answer a natural language question about a given image. Existing graph-based methods focus only on the relations between objects in an image and neglect the syntactic dependency relations between words in a question. To capture both kinds of relations simultaneously, we propose a novel dual channel graph convolutional network (DC-GCN) that better combines visual and textual information. The DC-GCN model consists of three parts: an I-GCN module to capture the relations between objects in the image, a Q-GCN module to capture the syntactic dependency relations between words in the question, and an attention alignment module to align image representations and question representations. Experimental results show that our model achieves comparable performance with state-of-the-art approaches.
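Both channels share the same building block: a graph-convolution step that updates each node's features from its neighbors, where the nodes are image objects (I-GCN) or question words linked by dependency edges (Q-GCN). The sketch below shows one such step; the details (numpy, degree normalization, ReLU) are generic GCN conventions assumed for illustration, not the paper's exact formulation.

# A minimal sketch of one graph-convolution layer over adjacency matrix A.
import numpy as np

def gcn_layer(H: np.ndarray, A: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One GCN step: H' = ReLU(D^-1 (A + I) H W), i.e., neighbor averaging."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # row-normalize by node degree
    return np.maximum(D_inv @ A_hat @ H @ W, 0.0)

# 3 nodes with 4-dim features; node 0 is connected to nodes 1 and 2.
H = np.random.rand(3, 4)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
W = np.random.rand(4, 4)
print(gcn_layer(H, A, W).shape)  # (3, 4)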
2019
A Boundary-aware Neural Model for Nested Named Entity Recognition
Changmeng Zheng | Yi Cai | Jingyun Xu | Ho-fung Leung | Guandong Xu
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
In natural language processing, it is common for entities to contain other entities inside them. Most existing work on named entity recognition (NER) deals only with flat entities and ignores nested ones. We propose a boundary-aware neural model for nested NER that leverages entity boundaries to predict entity categorical labels. Our model locates entities precisely by detecting boundaries with sequence labeling models. Based on the detected boundaries, it uses the boundary-relevant regions to predict entity categorical labels, which decreases computation cost and relieves the error propagation problem in layered sequence labeling models. We introduce multitask learning to capture the dependencies between entity boundaries and their categorical labels, which helps improve the performance of identifying entities. We conduct experiments on the GENIA dataset, and the results demonstrate that our model outperforms other state-of-the-art methods.
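A minimal sketch of the region step, assuming boundary detection has already produced start and end positions: the detected boundaries are paired into candidate entity regions, and only those regions are classified, so overlapping (nested) spans fall out naturally. The pairing rule here (every predicted start with every later predicted end, up to a length cap) is an illustrative assumption.

def candidate_regions(starts: list, ends: list, max_len: int = 8) -> list:
    """Pair predicted boundary positions into (start, end) spans to classify."""
    return [(s, e) for s in starts for e in ends
            if s <= e and e - s + 1 <= max_len]

# Tokens 1 and 2 were predicted as starts, tokens 2 and 4 as ends:
# nested spans such as (1, 4) and (2, 2) are both kept as candidates.
print(candidate_regions([1, 2], [2, 4]))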
2016
Exploring Topic Discriminating Power of Words in Latent Dirichlet Allocation
Kai Yang | Yi Cai | Zhenhong Chen | Ho-fung Leung | Raymond Lau
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Latent Dirichlet Allocation (LDA) and its variants have been widely used to discover latent topics in textual documents. However, some of the topics generated by LDA may be noisy, with irrelevant words scattered across them. We call such words topic-indiscriminate words, as they tend to make topics more ambiguous and less interpretable by humans. In this work, we propose a new topic model named TWLDA, which assigns low weights to words with low topic-discriminating power. Experimental results show that the proposed approach, which effectively reduces the number of topic-indiscriminate words in discovered topics, improves the effectiveness of LDA.
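One plausible way to score topic-discriminating power, sketched below: a word whose probability mass is spread evenly across topics is topic-indiscriminate and receives a low weight, while a word concentrated in a few topics receives a high one. Using normalized entropy as the measure is an assumption for illustration, not necessarily TWLDA's actual formula.

# A minimal sketch, assuming we have each word's distribution over topics.
import math

def topic_weight(topic_probs: list) -> float:
    """Weight in [0, 1]: 1 minus normalized entropy of the word's topic distribution."""
    total = sum(topic_probs)
    probs = [p / total for p in topic_probs if p > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    max_entropy = math.log(len(topic_probs))
    return 1.0 - entropy / max_entropy

print(topic_weight([0.97, 0.01, 0.01, 0.01]))  # high weight: discriminative word
print(topic_weight([0.25, 0.25, 0.25, 0.25]))  # 0.0: topic-indiscriminate word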