@inproceedings{song-etal-2021-new,
    title = "A New Approach to Overgenerating and Scoring Abstractive Summaries",
    author = "Song, Kaiqiang  and
      Wang, Bingqing  and
      Feng, Zhe  and
      Liu, Fei",
    editor = "Toutanova, Kristina  and
      Rumshisky, Anna  and
      Zettlemoyer, Luke  and
      Hakkani-Tur, Dilek  and
      Beltagy, Iz  and
      Bethard, Steven  and
      Cotterell, Ryan  and
      Chakraborty, Tanmoy  and
      Zhou, Yichao",
    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jun,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2021.naacl-main.110/",
    doi = "10.18653/v1/2021.naacl-main.110",
    pages = "1392--1404",
    abstract = "We propose a new approach to generate multiple variants of the target summary with diverse content and varying lengths, then score and select admissible ones according to users' needs. Abstractive summarizers trained on single reference summaries may struggle to produce outputs that achieve multiple desirable properties, i.e., capturing the most important information, being faithful to the original, grammatical and fluent. In this paper, we propose a two-staged strategy to generate a diverse set of candidate summaries from the source text in stage one, then score and select admissible ones in stage two. Importantly, our generator gives a precise control over the length of the summary, which is especially well-suited when space is limited. Our selectors are designed to predict the optimal summary length and put special emphasis on faithfulness to the original text. Both stages can be effectively trained, optimized and evaluated. Our experiments on benchmark summarization datasets suggest that this paradigm can achieve state-of-the-art performance."
}