Retrieval of the Best Counterargument without Prior Topic Knowledge

Henning Wachsmuth, Shahbaz Syed, Benno Stein


Abstract
Given any argument on any controversial topic, how can it best be countered? This question implies the challenging retrieval task of finding the best counterargument. Since prior knowledge of a topic cannot be expected in general, we hypothesize that the best counterargument invokes the same aspects as the argument while having the opposite stance. To operationalize our hypothesis, we simultaneously model the similarity and dissimilarity of pairs of arguments, based on the words and embeddings of the arguments' premises and conclusions. A salient property of our model is its independence from the topic at hand, i.e., it applies to arbitrary arguments. We evaluate different model variations on millions of argument pairs derived from the web portal idebate.org. Systematic ranking experiments suggest that our hypothesis holds for many arguments: given 7.6 candidates with opposing stance on average, we rank the best counterargument highest with 60% accuracy. Even among all 2801 test set pairs as candidates, we still find the best one about every third time.
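To make the abstract's "same aspects, opposite stance" idea concrete, here is a minimal Python sketch. It is not the authors' model: it uses plain word-overlap cosine similarity only (the paper also uses word embeddings), and the weight alpha, the dict keys premises/conclusion, and the choice to reward premise similarity while penalizing conclusion similarity are illustrative assumptions, one simple reading of the hypothesis.

    # Minimal sketch (illustrative, not the paper's exact scoring function):
    # rank counterargument candidates by combining a similarity term
    # (same aspects) with a dissimilarity term (opposite stance).
    from collections import Counter
    import math

    def bag_of_words(text):
        """Lowercased token counts; a stand-in for the word-based features."""
        return Counter(text.lower().split())

    def cosine(c1, c2):
        """Cosine similarity between two sparse count vectors."""
        dot = sum(c1[t] * c2[t] for t in set(c1) & set(c2))
        n1 = math.sqrt(sum(v * v for v in c1.values()))
        n2 = math.sqrt(sum(v * v for v in c2.values()))
        return dot / (n1 * n2) if n1 and n2 else 0.0

    def score(argument, candidate, alpha=0.5):
        """Assumed scoring: high premise similarity (same aspects),
        low conclusion similarity (opposite stance). alpha is illustrative."""
        prem_sim = cosine(bag_of_words(argument["premises"]),
                          bag_of_words(candidate["premises"]))
        concl_sim = cosine(bag_of_words(argument["conclusion"]),
                           bag_of_words(candidate["conclusion"]))
        return alpha * prem_sim - (1 - alpha) * concl_sim

    def best_counterargument(argument, candidates):
        """Return the highest-scoring candidate; no topic knowledge needed."""
        return max(candidates, key=lambda c: score(argument, c))

Note that nothing in this sketch depends on the topic: it only compares the two texts at hand, which mirrors the topic independence claimed in the abstract.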
Anthology ID: P18-1023
Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2018
Address: Melbourne, Australia
Editors: Iryna Gurevych, Yusuke Miyao
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 241–251
URL: https://aclanthology.org/P18-1023
DOI: 10.18653/v1/P18-1023
Cite (ACL): Henning Wachsmuth, Shahbaz Syed, and Benno Stein. 2018. Retrieval of the Best Counterargument without Prior Topic Knowledge. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 241–251, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal): Retrieval of the Best Counterargument without Prior Topic Knowledge (Wachsmuth et al., ACL 2018)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/P18-1023.pdf
Note: P18-1023.Notes.pdf
Video: https://preview.aclanthology.org/naacl-24-ws-corrections/P18-1023.mp4
Data: ConceptNet