@inproceedings{egonmwan-etal-2019-cross,
    title = "Cross-Task Knowledge Transfer for Query-Based Text Summarization",
    author = "Egonmwan, Elozino  and
      Castelli, Vittorio  and
      Sultan, Md Arafat",
    editor = "Fisch, Adam  and
      Talmor, Alon  and
      Jia, Robin  and
      Seo, Minjoon  and
      Choi, Eunsol  and
      Chen, Danqi",
    booktitle = "Proceedings of the 2nd Workshop on Machine Reading for Question Answering",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/D19-5810/",
    doi = "10.18653/v1/D19-5810",
    pages = "72--77",
    abstract = "We demonstrate the viability of knowledge transfer between two related tasks: machine reading comprehension (MRC) and query-based text summarization. Using an MRC model trained on the SQuAD1.1 dataset as a core system component, we first build an extractive query-based summarizer. For better precision, this summarizer also compresses the output of the MRC model using a novel sentence compression technique. We further leverage pre-trained machine translation systems to abstract our extracted summaries. Our models achieve state-of-the-art results on the publicly available CNN/Daily Mail and Debatepedia datasets, and can serve as simple yet powerful baselines for future systems. We also hope that these results will encourage research on transfer learning from large MRC corpora to query-based summarization."
}