@inproceedings{liu-etal-2019-attention,
    title = "Attention Neural Model for Temporal Relation Extraction",
    author = "Liu, Sijia  and
      Wang, Liwei  and
      Chaudhary, Vipin  and
      Liu, Hongfang",
    editor = "Rumshisky, Anna  and
      Roberts, Kirk  and
      Bethard, Steven  and
      Naumann, Tristan",
    booktitle = "Proceedings of the 2nd Clinical Natural Language Processing Workshop",
    month = jun,
    year = "2019",
    address = "Minneapolis, Minnesota, USA",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/W19-1917/",
    doi = "10.18653/v1/W19-1917",
    pages = "134--139",
    abstract = "Neural network models have shown promise in the temporal relation extraction task. In this paper, we present the attention based neural network model to extract the containment relations within sentences from clinical narratives. The attention mechanism used on top of GRU model outperforms the existing state-of-the-art neural network models on THYME corpus in intra-sentence temporal relation extraction."
}

Markdown (Informal)
[Attention Neural Model for Temporal Relation Extraction](https://aclanthology.org/W19-1917/) (Liu et al., ClinicalNLP 2019)
ACL
Sijia Liu, Liwei Wang, Vipin Chaudhary, and Hongfang Liu. 2019. Attention Neural Model for Temporal Relation Extraction. In Proceedings of the 2nd Clinical Natural Language Processing Workshop, pages 134–139, Minneapolis, Minnesota, USA. Association for Computational Linguistics.