Using Social Networks to Improve Language Variety Identification with Neural Networks

Yasuhide Miura, Tomoki Taniguchi, Motoki Taniguchi, Shotaro Misawa, Tomoko Ohkuma


Abstract
We propose a hierarchical neural network model for language variety identification that integrates information from a social network. Language variety identification has recently gained popularity as an advanced form of language identification. The proposed model uses additional texts from a social network to improve language variety identification in two ways. First, these texts introduce the effects of homophily. Second, they serve as expanded training data for the shared layers of the proposed model. By incorporating social network information, the model improved its accuracy by 1.67–5.56. Compared with state-of-the-art baselines, the improved performance is better in English and comparable in Spanish. Furthermore, we analyzed Portuguese and Arabic, where the model performed weakly, and found that the effect of homophily is likely weaker for these languages because of sparsity and noise, compared with the languages where the model performed strongly.
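To illustrate the idea described in the abstract, the sketch below shows a hierarchical classifier whose text-encoding layers are shared between a user's own texts and texts from the user's social network, with the two resulting representations combined for variety prediction. This is a minimal, hypothetical PyTorch sketch, not the authors' code: all class names, layer choices (bidirectional GRUs, concatenation of user and neighbor vectors), and hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch (not the authors' implementation): hierarchical model with
# word-level and text-level layers shared between a user's texts and texts from
# the user's social network.
import torch
import torch.nn as nn


class HierarchicalVarietyClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=100, num_varieties=2):
        super().__init__()
        # Shared word-level layers: applied both to the user's own texts and to
        # texts from socially connected users (the "expanded training data").
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Text-level layer aggregates per-text vectors into a user-level vector.
        self.text_rnn = nn.GRU(2 * hidden_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.classifier = nn.Linear(4 * hidden_dim, num_varieties)

    def encode_texts(self, token_ids):
        # token_ids: (num_texts, max_len) -> one vector per text.
        embedded = self.embedding(token_ids)
        _, h = self.word_rnn(embedded)            # h: (2, num_texts, hidden)
        return torch.cat([h[0], h[1]], dim=-1)    # (num_texts, 2 * hidden)

    def encode_user(self, token_ids):
        # Aggregate a set of texts into a single representation.
        text_vecs = self.encode_texts(token_ids).unsqueeze(0)
        _, h = self.text_rnn(text_vecs)
        return torch.cat([h[0], h[1]], dim=-1).squeeze(0)  # (2 * hidden,)

    def forward(self, user_token_ids, neighbor_token_ids):
        # Combine the user's representation with that of the social-network
        # texts; plain concatenation stands in for whatever combination the
        # paper actually uses to model homophily.
        user_vec = self.encode_user(user_token_ids)
        neighbor_vec = self.encode_user(neighbor_token_ids)
        return self.classifier(torch.cat([user_vec, neighbor_vec], dim=-1))


# Toy usage: 5 user texts and 8 neighbor texts, each padded to 20 token ids.
model = HierarchicalVarietyClassifier(vocab_size=10000, num_varieties=2)
user_texts = torch.randint(1, 10000, (5, 20))
neighbor_texts = torch.randint(1, 10000, (8, 20))
logits = model(user_texts, neighbor_texts)
print(logits.shape)  # torch.Size([2])
```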
Anthology ID:
I17-2045
Volume:
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2017
Address:
Taipei, Taiwan
Venue:
IJCNLP
Publisher:
Asian Federation of Natural Language Processing
Pages:
263–270
URL:
https://aclanthology.org/I17-2045
Cite (ACL):
Yasuhide Miura, Tomoki Taniguchi, Motoki Taniguchi, Shotaro Misawa, and Tomoko Ohkuma. 2017. Using Social Networks to Improve Language Variety Identification with Neural Networks. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 263–270, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
Using Social Networks to Improve Language Variety Identification with Neural Networks (Miura et al., IJCNLP 2017)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/I17-2045.pdf