@inproceedings{fan-etal-2019-using,
    title = "Using Local Knowledge Graph Construction to Scale {S}eq2{S}eq Models to Multi-Document Inputs",
    author = "Fan, Angela  and
      Gardent, Claire  and
      Braud, Chlo{\'e}  and
      Bordes, Antoine",
    editor = "Inui, Kentaro  and
      Jiang, Jing  and
      Ng, Vincent  and
      Wan, Xiaojun",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/D19-1428/",
    doi = "10.18653/v1/D19-1428",
    pages = "4186--4196",
    abstract = "Query-based open-domain NLP tasks require information synthesis from long and diverse web results. Current approaches extractively select portions of web text as input to Sequence-to-Sequence models using methods such as TF-IDF ranking. We propose constructing a local graph structured knowledge base for each query, which compresses the web search information and reduces redundancy. We show that by linearizing the graph into a structured input sequence, models can encode the graph representations within a standard Sequence-to-Sequence setting. For two generative tasks with very long text input, long-form question answering and multi-document summarization, feeding graph representations as input can achieve better performance than using retrieved text portions."
}