@inproceedings{sorokin-gurevych-2017-context,
    title = "Context-Aware Representations for Knowledge Base Relation Extraction",
    author = "Sorokin, Daniil  and
      Gurevych, Iryna",
    editor = "Palmer, Martha  and
      Hwa, Rebecca  and
      Riedel, Sebastian",
    booktitle = "Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing",
    month = sep,
    year = "2017",
    address = "Copenhagen, Denmark",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/D17-1188/",
    doi = "10.18653/v1/D17-1188",
    pages = "1784--1789",
    abstract = "We demonstrate that for sentence-level relation extraction it is beneficial to consider other relations in the sentential context while predicting the target relation. Our architecture uses an LSTM-based encoder to jointly learn representations for all relations in a single sentence. We combine the context representations with an attention mechanism to make the final prediction. We use the Wikidata knowledge base to construct a dataset of multiple relations per sentence and to evaluate our approach. Compared to a baseline system, our method results in an average error reduction of 24{\%} on a held-out set of relations. The code and the dataset to replicate the experiments are made available at \url{https://github.com/ukplab/}."
}