Abstract
Most existing studies of language use in social media have focused on surface-level linguistic features (e.g., function words and punctuation marks) and semantic-level aspects (e.g., topics, sentiment, and emotions) of comments. Writers' strategies for constructing and connecting text segments have not been widely explored, even though this knowledge is expected to shed light on how people reason in online environments. Contributing to this direction of social media analysis, we build an openly accessible neural RST parsing system that analyzes the discourse relations in an online comment. Our experiments demonstrate that this system achieves performance comparable to that of other neural RST parsing systems. To demonstrate the use of this tool in social media analysis, we apply it to identify the discourse relations in persuasive and non-persuasive comments and examine the relationships among binary discourse tree depth, discourse relations, and the perceived persuasiveness of online comments. Our work demonstrates the potential of analyzing the discourse structures of online comments with our system and the implications of these structures for understanding online communication.
- Anthology ID:
- 2021.wnut-1.30
- Volume:
- Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021)
- Month:
- November
- Year:
- 2021
- Address:
- Online
- Editors:
- Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
- Venue:
- WNUT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 274–283
- URL:
- https://aclanthology.org/2021.wnut-1.30
- DOI:
- 10.18653/v1/2021.wnut-1.30
- Cite (ACL):
- Jinfen Li and Lu Xiao. 2021. Neural-based RST Parsing And Analysis In Persuasive Discourse. In Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021), pages 274–283, Online. Association for Computational Linguistics.
- Cite (Informal):
- Neural-based RST Parsing And Analysis In Persuasive Discourse (Li & Xiao, WNUT 2021)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-4/2021.wnut-1.30.pdf