Exploring Transformers for Ranking Portuguese Semantic Relations

Hugo Gonçalo Oliveira


Abstract
We explored transformer-based language models for ranking instances of Portuguese lexico-semantic relations. Weights were based on the likelihood of natural language sequences that conveyed the relation instances, and we expected them to be useful for filtering out noisier instances. However, after analysing the weights, no strong conclusions could be drawn. They are not correlated with redundancy, but are lower for instances with longer and more specific arguments, which may nevertheless be a consequence of their sensitivity to the frequency of such arguments. They also did not prove useful for computing word similarity with network embeddings. Despite the negative results, we see the reported experiments and insights as a further contribution towards better understanding transformer language models like BERT and GPT, and we make the weighted instances publicly available for further research.
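As a rough illustration of the ranking idea described in the abstract, the sketch below scores Portuguese sentences that verbalise relation instances with the pseudo-log-likelihood of a masked language model, here BERTimbau via the Hugging Face transformers library. The template sentences and the choice of scoring function are illustrative assumptions, not the paper's exact setup.

    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    MODEL = "neuralmind/bert-base-portuguese-cased"  # BERTimbau; any Portuguese masked LM could be used
    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForMaskedLM.from_pretrained(MODEL)
    model.eval()

    def pseudo_log_likelihood(sentence: str) -> float:
        """Sum of log-probabilities of each token when it is masked in turn."""
        ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
        total = 0.0
        for i in range(1, len(ids) - 1):          # skip [CLS] and [SEP]
            masked = ids.clone()
            masked[i] = tokenizer.mask_token_id
            with torch.no_grad():
                logits = model(masked.unsqueeze(0)).logits[0, i]
            total += torch.log_softmax(logits, dim=-1)[ids[i]].item()
        return total

    # Rank two hypothetical hypernymy instances by how natural their
    # textual renderings sound to the LM (higher score = more plausible).
    for s in ["Um cão é um tipo de animal.", "Um cão é um tipo de ferramenta."]:
        print(s, pseudo_log_likelihood(s))

Under this sketch, an instance whose verbalisation receives a higher likelihood would be weighted higher, and low-scoring instances could be filtered out as noise.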
Anthology ID:
2022.lrec-1.275
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
2573–2582
URL:
https://aclanthology.org/2022.lrec-1.275
Cite (ACL):
Hugo Gonçalo Oliveira. 2022. Exploring Transformers for Ranking Portuguese Semantic Relations. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 2573–2582, Marseille, France. European Language Resources Association.
Cite (Informal):
Exploring Transformers for Ranking Portuguese Semantic Relations (Gonçalo Oliveira, LREC 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.lrec-1.275.pdf