Giancarlo Salton

Also published as: Giancarlo D. Salton


2021

Formulating Automated Responses to Cognitive Distortions for CBT Interactions
Ignacio de Toledo Rodriguez | Giancarlo Salton | Robert Ross
Proceedings of The Fourth International Conference on Natural Language and Speech Processing (ICNLSP 2021)

2019

Persistence pays off: Paying Attention to What the LSTM Gating Mechanism Persists
Giancarlo Salton | John Kelleher
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)

Recurrent Neural Network Language Models composed of LSTM units, especially those augmented with an external memory, have achieved state-of-the-art results in Language Modeling. However, because information fades over time, these models still struggle to process long sequences, which are more likely to contain long-distance dependencies. In this paper we demonstrate an effective mechanism for retrieving information in a memory-augmented LSTM LM, based on attending to information in memory in proportion to the number of timesteps for which the LSTM gating mechanism persisted that information.
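
The gist of that retrieval mechanism can be illustrated with a small sketch. The code below is an assumption about one plausible formulation, not the authors' implementation: it scores each memory slot by content relevance and biases the attention weights with a persistence proxy derived from recorded forget-gate activations. All function and variable names are hypothetical.

```python
# Minimal sketch (assumed formulation, not the paper's code): attend over a
# memory of past LSTM hidden states, favouring slots whose information the
# forget gate kept around for many timesteps.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def persistence_attention(query, memory, forget_gates):
    """
    query        : (d,)   current LSTM hidden state
    memory       : (T, d) past hidden states stored in external memory
    forget_gates : (T, d) forget-gate activations recorded at each timestep
    """
    # Persistence proxy: the cumulative product of forget gates from step t
    # onwards approximates how long information written at t survived.
    persistence = np.cumprod(forget_gates[::-1], axis=0)[::-1].mean(axis=1)  # (T,)
    scores = memory @ query                                  # content relevance
    weights = softmax(scores + np.log(persistence + 1e-8))   # bias toward persisted slots
    return weights @ memory                                  # context vector, shape (d,)

# toy usage with random placeholder tensors
T, d = 5, 8
rng = np.random.default_rng(0)
ctx = persistence_attention(rng.standard_normal(d),
                            rng.standard_normal((T, d)),
                            rng.uniform(0.5, 1.0, size=(T, d)))
print(ctx.shape)  # (8,)
```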

2018

Is it worth it? Budget-related evaluation metrics for model selection
Filip Klubička | Giancarlo D. Salton | John D. Kelleher
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)

2017

Attentive Language Models
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)

In this paper, we extend Recurrent Neural Network Language Models (RNN-LMs) with an attention mechanism. We show that an “attentive” RNN-LM (with 11M parameters) achieves lower perplexity than larger RNN-LMs (with 66M parameters) and performs comparably to an ensemble of 10 similarly sized RNN-LMs. We also show that an “attentive” RNN-LM needs less contextual information to achieve results similar to the state of the art on the wikitext2 dataset.
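
As a rough illustration of the general idea (an assumption, not the paper's model), the sketch below shows a single step of an RNN-LM in which the current hidden state attends over earlier hidden states and the resulting context vector is combined with it before the next-word softmax. The bilinear scoring and the concatenation are illustrative choices; all names are hypothetical.

```python
# Minimal sketch of an "attentive" RNN-LM output step (assumed formulation).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_lm_step(h_t, history, W_score, W_out):
    """
    h_t     : (d,)    hidden state at the current timestep
    history : (T, d)  hidden states from earlier timesteps
    W_score : (d, d)  bilinear attention parameters
    W_out   : (V, 2d) output projection to the vocabulary
    """
    scores = history @ (W_score @ h_t)           # relevance of each past state
    alpha = softmax(scores)                      # attention weights over history
    context = alpha @ history                    # attention-weighted summary
    logits = W_out @ np.concatenate([h_t, context])
    return softmax(logits)                       # next-word distribution, shape (V,)
```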

Idiom Type Identification with Smoothed Lexical Features and a Maximum Margin Classifier
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the International Conference Recent Advances in Natural Language Processing, RANLP 2017

In our work we address limitations in the state of the art in idiom type identification. We investigate different approaches to a lexical fixedness metric, a component of the state-of-the-art model. We also show that our Machine Learning-based approach to the idiom type identification task achieves an F1-score of 0.85, an improvement of 11 points over the state of the art.
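
A minimal sketch of the kind of setup described, assuming scikit-learn and placeholder features (this is not the authors' code): lexical-fixedness-style feature vectors for candidate expressions are fed to a maximum margin (SVM) classifier and evaluated with F1. The feature values and labels below are random stand-ins.

```python
# Sketch: maximum margin classifier over hypothetical lexical features
# for idiom type identification. Data here is placeholder, not real.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Each row: assumed features for one candidate expression, e.g. a lexical
# fixedness score plus simple distributional statistics.
X_train = rng.standard_normal((200, 4))
y_train = rng.integers(0, 2, size=200)   # 1 = idiomatic type, 0 = literal
X_test = rng.standard_normal((50, 4))
y_test = rng.integers(0, 2, size=50)

clf = LinearSVC(C=1.0, max_iter=5000)    # maximum margin classifier
clf.fit(X_train, y_train)
print("F1:", f1_score(y_test, clf.predict(X_test)))
```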

2016

Idiom Token Classification using Sentential Distributed Semantics
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

2014

Evaluation of a Substitution Method for Idiom Transformation in Statistical Machine Translation
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the 10th Workshop on Multiword Expressions (MWE)

An Empirical Study of the Impact of Idioms on Phrase Based Statistical Machine Translation of English to Brazilian-Portuguese
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the 3rd Workshop on Hybrid Approaches to Machine Translation (HyTra)