@inproceedings{lebanoff-etal-2021-modeling,
    title = "Modeling Endorsement for Multi-Document Abstractive Summarization",
    author = "Lebanoff, Logan  and
      Wang, Bingqing  and
      Feng, Zhe  and
      Liu, Fei",
    editor = "Carenini, Giuseppe  and
      Cheung, Jackie Chi Kit  and
      Dong, Yue  and
      Liu, Fei  and
      Wang, Lu",
    booktitle = "Proceedings of the Third Workshop on New Frontiers in Summarization",
    month = nov,
    year = "2021",
    address = "Online and in the Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.newsum-1.13/",
    doi = "10.18653/v1/2021.newsum-1.13",
    pages = "119--130",
    abstract = "A crucial difference between single- and multi-document summarization is how salient content manifests itself in the document(s). While such content may appear at the beginning of a single document, essential information is frequently reiterated in a set of documents related to a particular topic, resulting in an endorsement effect that increases information salience. In this paper, we model the cross-document endorsement effect and its utilization in multiple document summarization. Our method generates a synopsis from each document, which serves as an endorser to identify salient content from other documents. Strongly endorsed text segments are used to enrich a neural encoder-decoder model to consolidate them into an abstractive summary. The method has a great potential to learn from fewer examples to identify salient content, which alleviates the need for costly retraining when the set of documents is dynamically adjusted. Through extensive experiments on benchmark multi-document summarization datasets, we demonstrate the effectiveness of our proposed method over strong published baselines. Finally, we shed light on future research directions and discuss broader challenges of this task using a case study."
}