Abstract
This paper focuses on the end-to-end abstractive summarization of a single product review without supervision. We assume that a review can be described as a discourse tree, in which the summary is the root, and the child sentences explain their parent in detail. By recursively estimating a parent from its children, our model learns the latent discourse tree without an external parser and generates a concise summary. We also introduce an architecture that ranks the importance of each sentence on the tree to support summary generation focusing on the main review point. The experimental results demonstrate that our model is competitive with or outperforms other unsupervised approaches. In particular, for relatively long reviews, it achieves competitive or better performance than supervised models. The induced tree shows that the child sentences provide additional information about their parent, and the generated summary abstracts the entire review.
- Anthology ID:
- P19-1206
- Volume:
- Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month:
- July
- Year:
- 2019
- Address:
- Florence, Italy
- Editors:
- Anna Korhonen, David Traum, Lluís Màrquez
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2142–2152
- URL:
- https://aclanthology.org/P19-1206
- DOI:
- 10.18653/v1/P19-1206
- Cite (ACL):
- Masaru Isonuma, Junichiro Mori, and Ichiro Sakata. 2019. Unsupervised Neural Single-Document Summarization of Reviews via Learning Latent Discourse Structure and its Ranking. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2142–2152, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal):
- Unsupervised Neural Single-Document Summarization of Reviews via Learning Latent Discourse Structure and its Ranking (Isonuma et al., ACL 2019)
- PDF:
- https://aclanthology.org/P19-1206.pdf
- Code:
- misonuma/strsum
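
To make the idea in the abstract concrete, here is a minimal toy sketch of the discourse-tree view: each parent sentence representation is estimated from its children, and recursing to the root yields a summary-level representation. This is only an illustration under simplifying assumptions, not the model released in misonuma/strsum; the class `Node` and function `estimate_parent` are hypothetical names, and the paper's learned neural estimator and decoder are replaced here by a plain average of child vectors.

```python
# Toy illustration of the latent-discourse-tree idea from the abstract:
# each parent sentence representation is estimated from its children,
# and recursing up to the root gives a summary-level representation.
# NOT the paper's model; names and the averaging "estimator" are stand-ins.
import numpy as np

class Node:
    def __init__(self, embedding=None, children=None):
        self.embedding = embedding      # sentence vector (leaf) or estimated vector (internal)
        self.children = children or []  # child sentences that elaborate on this node

def estimate_parent(node):
    """Bottom-up pass: estimate every internal node, including the root
    (the summary representation), from its children's vectors."""
    if not node.children:               # leaf: observed sentence embedding
        return node.embedding
    child_vecs = np.stack([estimate_parent(c) for c in node.children])
    node.embedding = child_vecs.mean(axis=0)  # stand-in for the learned estimator
    return node.embedding

# Three "sentence embeddings" (random stand-ins for encoder outputs).
rng = np.random.default_rng(0)
leaves = [Node(rng.normal(size=4)) for _ in range(3)]
root = Node(children=[Node(children=leaves[:2]), leaves[2]])
summary_vec = estimate_parent(root)     # would be decoded into the summary text
print(summary_vec.shape)                # (4,)
```

In the actual model, the tree itself is latent and induced jointly with the estimator rather than given as above, and an additional ranking component weights the importance of each sentence on the tree so that the decoded summary focuses on the main review point.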