MEAN: Multi-head Entity Aware Attention Network for Political Perspective Detection in News Media

Chang Li, Dan Goldwasser


Abstract
The way information is generated and disseminated has changed dramatically over the last decade. Identifying the political perspective shaping the way events are discussed in the media has become increasingly important given the sharp rise in the number of news outlets and articles. Previous approaches usually leverage only linguistic information. However, news articles strive to maintain credibility and to appear impartial; bias is therefore introduced in subtle ways, usually by emphasizing different aspects of the story. In this paper, we propose a novel framework that considers entities mentioned in news articles, together with external knowledge about them, to capture the bias with respect to those entities. We explore different ways to inject entity information into the text model. Experiments show that our proposed framework achieves significant improvements over standard text models and is capable of identifying differences between news narratives written from different perspectives.
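To make the core idea concrete, below is a minimal sketch of one way to "inject entity information into the text model" with multi-head attention, as the abstract describes: text token states attend over external entity embeddings, and the entity context is fused back into the token representations before classification. This is not the authors' released implementation; all module names, dimensions, and the residual-fusion design are illustrative assumptions.

```python
# A hedged sketch of entity-aware multi-head attention (not the paper's code).
import torch
import torch.nn as nn

class EntityAwareAttention(nn.Module):
    def __init__(self, hidden_dim: int = 256, num_heads: int = 4):
        super().__init__()
        # Text tokens act as queries; entity embeddings act as keys/values.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)
        # Assumed binary perspective label (e.g., left vs. right).
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, token_states, entity_embeds, entity_mask=None):
        # token_states:  (batch, seq_len, hidden_dim) from any text encoder
        # entity_embeds: (batch, num_entities, hidden_dim) external knowledge
        # entity_mask:   (batch, num_entities), True at padded entity slots
        entity_ctx, _ = self.attn(
            query=token_states, key=entity_embeds, value=entity_embeds,
            key_padding_mask=entity_mask,
        )
        # Residual fusion of entity context into the token representations.
        fused = self.norm(token_states + entity_ctx)
        # Mean-pool over tokens and predict the article's perspective.
        return self.classifier(fused.mean(dim=1))

# Usage with dummy tensors standing in for encoder and entity outputs.
model = EntityAwareAttention()
tokens = torch.randn(2, 128, 256)   # encoder output for 2 articles
entities = torch.randn(2, 8, 256)   # 8 entity embeddings per article
logits = model(tokens, entities)    # shape: (2, 2)
```

The residual-plus-LayerNorm fusion is one common design choice; the paper explores several alternatives for combining entity and text signals, which this sketch does not attempt to reproduce.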
Anthology ID:
2021.nlp4if-1.10
Volume:
Proceedings of the Fourth Workshop on NLP for Internet Freedom: Censorship, Disinformation, and Propaganda
Month:
June
Year:
2021
Address:
Online
Editors:
Anna Feldman, Giovanni Da San Martino, Chris Leberknight, Preslav Nakov
Venue:
NLP4IF
Publisher:
Association for Computational Linguistics
Pages:
66–75
URL:
https://aclanthology.org/2021.nlp4if-1.10
DOI:
10.18653/v1/2021.nlp4if-1.10
Cite (ACL):
Chang Li and Dan Goldwasser. 2021. MEAN: Multi-head Entity Aware Attention Network for Political Perspective Detection in News Media. In Proceedings of the Fourth Workshop on NLP for Internet Freedom: Censorship, Disinformation, and Propaganda, pages 66–75, Online. Association for Computational Linguistics.
Cite (Informal):
MEAN: Multi-head Entity Aware Attention Network for Political Perspective Detection in News Media (Li & Goldwasser, NLP4IF 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.nlp4if-1.10.pdf