Mathias Niepert


2019

Attending to Future Tokens for Bidirectional Sequence Generation
Carolin Lawrence | Bhushan Kotnis | Mathias Niepert
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)

Neural sequence generation is typically performed token-by-token and left-to-right. Whenever a token is generated, only previously produced tokens are taken into consideration. In contrast, for problems such as sequence classification, bidirectional attention, which takes both past and future tokens into consideration, has been shown to perform much better. We propose to make the sequence generation process bidirectional by employing special placeholder tokens. Treated as a node in a fully connected graph, a placeholder token can take past and future tokens into consideration when generating the actual output token. We verify the effectiveness of our approach experimentally on two conversational tasks, where the proposed bidirectional model outperforms competitive baselines by a large margin.
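A minimal sketch of the placeholder mechanism described in the abstract, assuming a fully connected attention mask and an arbitrary prediction function; all names here are illustrative, not the paper's code:

```python
import numpy as np

PLACEHOLDER = -1  # assumed id for the special placeholder token

def bidirectional_mask(seq_len: int) -> np.ndarray:
    # Fully connected attention: every position may attend to every other,
    # unlike the lower-triangular mask used in left-to-right decoding.
    return np.ones((seq_len, seq_len), dtype=bool)

def generate(tokens: list, predict) -> list:
    # Repeatedly replace a placeholder with a real token; `predict` is any
    # model that maps the full (past *and* future) context to a token id.
    tokens = list(tokens)
    while PLACEHOLDER in tokens:
        i = tokens.index(PLACEHOLDER)
        tokens[i] = predict(tokens, i)
    return tokens
```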

Cross-Sentence N-ary Relation Extraction using Lower-Arity Universal Schemas
Kosuke Akimoto | Takuya Hiraoka | Kunihiko Sadamasa | Mathias Niepert
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)

Most existing relation extraction approaches exclusively target binary relations, and n-ary relation extraction remains relatively unexplored. The current state-of-the-art n-ary relation extraction method is based on supervised learning and may therefore suffer from a lack of sufficient relation labels. In this paper, we propose a novel approach to cross-sentence n-ary relation extraction based on universal schemas. To alleviate the sparsity problem and to leverage the inherent decomposability of n-ary relations, we propose to learn relation representations of lower-arity facts that result from decomposing higher-arity facts. The proposed method computes the score of a new n-ary fact by aggregating the scores of its decomposed lower-arity facts. We conduct experiments on datasets for ternary relation extraction and empirically show that our method improves n-ary relation extraction performance compared to previous methods.
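A toy sketch of the decompose-and-aggregate step, assuming mean aggregation and an arbitrary lower-arity scoring function; both choices are assumptions, as the paper may aggregate differently:

```python
from itertools import combinations

def decompose(entities: tuple, arity: int = 2) -> list:
    # All lower-arity sub-facts of an n-ary fact; e.g. the ternary fact
    # (e1, e2, e3) decomposes into the pairs (e1,e2), (e1,e3), (e2,e3).
    return list(combinations(entities, arity))

def nary_score(relation: str, entities: tuple, score_fn) -> float:
    # Score a new n-ary fact by aggregating (here: averaging) the scores
    # of its decomposed lower-arity facts.
    subs = decompose(entities)
    return sum(score_fn(relation, sub) for sub in subs) / len(subs)
```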

2018

LRMM: Learning to Recommend with Missing Modalities
Cheng Wang | Mathias Niepert | Hui Li
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing

Multimodal learning has shown promising performance in content-based recommendation due to the auxiliary user and item information of multiple modalities such as text and images. However, the problem of incomplete and missing modalities is rarely explored, and most existing methods fail to learn a recommendation model when modalities are missing or corrupted. In this paper, we propose LRMM, a novel framework that mitigates not only the problem of missing modalities but also, more generally, the cold-start problem of recommender systems. We propose modality dropout (m-drop) and a multimodal sequential autoencoder (m-auto) to learn multimodal representations for complementing and imputing missing modalities. Extensive experiments on real-world Amazon data show that LRMM achieves state-of-the-art performance on rating prediction tasks. More importantly, LRMM is more robust than previous methods in alleviating data sparsity and the cold-start problem.
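A minimal sketch of the modality-dropout idea, assuming a dictionary of per-modality features and a dropout rate of 0.3; both the interface and the rate are assumptions, not the paper's implementation:

```python
import random

def modality_dropout(modalities: dict, p: float = 0.3) -> dict:
    # Randomly drop whole modalities (replace them with None) during
    # training, so the autoencoder learns to impute the missing ones.
    dropped = {m: (None if random.random() < p else x)
               for m, x in modalities.items()}
    if all(x is None for x in dropped.values()):
        # Always keep at least one modality as input.
        keep = random.choice(list(modalities))
        dropped[keep] = modalities[keep]
    return dropped

# e.g. modality_dropout({"text": text_vec, "image": img_vec, "rating": r})
```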

Learning Sequence Encoders for Temporal Knowledge Graph Completion
Alberto García-Durán | Sebastijan Dumančić | Mathias Niepert
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing

Research on link prediction in knowledge graphs has mainly focused on static multi-relational data. In this work, we consider temporal knowledge graphs, where relations between entities may hold only for a time interval or at a specific point in time. In line with previous work on static knowledge graphs, we propose to address this problem by learning latent entity and relation type representations. To incorporate temporal information, we utilize recurrent neural networks to learn time-aware representations of relation types, which can be used in conjunction with existing latent factorization methods. The proposed approach is shown to be robust to common challenges in real-world KGs: the sparsity and heterogeneity of temporal expressions. Experiments show the benefits of our approach on four temporal KGs. The data sets are available under a permissive BSD-3 license.
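A toy sketch of the time-aware relation representation, assuming character-level time tokens, an untrained vanilla RNN, and a DistMult-style scorer; all three are illustrative stand-ins, not the paper's architecture:

```python
import numpy as np

D = 16
rng = np.random.default_rng(0)
emb = {}  # token -> vector, grown lazily for this sketch

def E(tok: str) -> np.ndarray:
    return emb.setdefault(tok, rng.normal(size=D))

W_h = rng.normal(size=(D, D)) * 0.1  # toy RNN weights (untrained)
W_x = rng.normal(size=(D, D)) * 0.1

def time_aware_relation(rel: str, timestamp: str) -> np.ndarray:
    # Run an RNN over the relation token followed by the characters of the
    # timestamp; the final hidden state is the time-aware relation vector.
    h = np.zeros(D)
    for tok in [rel] + list(timestamp):
        h = np.tanh(W_h @ h + W_x @ E(tok))
    return h

def score(subj: str, rel: str, obj: str, t: str) -> float:
    # Plug the time-aware relation vector into a DistMult-style scorer.
    return float(np.sum(E(subj) * time_aware_relation(rel, t) * E(obj)))

# e.g. score("obama", "presidentOf", "usa", "2009-01-20")
```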

2012

Towards Distributed MCMC Inference in Probabilistic Knowledge Bases
Mathias Niepert | Christian Meilicke | Heiner Stuckenschmidt
Proceedings of the Joint Workshop on Automatic Knowledge Base Construction and Web-scale Knowledge Extraction (AKBC-WEKEX)

2011

Fine-Grained Sentiment Analysis with Structural Features
Cäcilia Zirn | Mathias Niepert | Heiner Stuckenschmidt | Michael Strube
Proceedings of 5th International Joint Conference on Natural Language Processing