Giancarlo Salton
Also published as: Giancarlo D. Salton
2021
Formulating Automated Responses to Cognitive Distortions for CBT Interactions
Ignacio de Toledo Rodriguez | Giancarlo Salton | Robert Ross
Proceedings of the 4th International Conference on Natural Language and Speech Processing (ICNLSP 2021)
2019
Persistence pays off: Paying Attention to What the LSTM Gating Mechanism Persists
Giancarlo Salton | John Kelleher
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Recurrent Neural Network Language Models composed of LSTM units, especially those augmented with an external memory, have achieved state-of-the-art results in Language Modeling. However, these models still struggle to process long sequences, which are more likely to contain long-distance dependencies, because of information fading. In this paper we demonstrate an effective mechanism for retrieving information in a memory-augmented LSTM LM, based on attending to information in memory in proportion to the number of timesteps for which the LSTM gating mechanism persisted that information.
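A minimal sketch of the retrieval idea described in the abstract, assuming a PyTorch-style setup in which past LSTM hidden states are stored in a memory and attention scores are scaled by a hypothetical per-slot persistence count derived from the gating mechanism; the function name and tensor layout are illustrative assumptions, not the paper's implementation.

```python
import torch


def persistence_weighted_attention(query, memory, persistence):
    """Attend over stored LSTM states, scaling attention scores by how many
    timesteps the gating mechanism persisted each slot's information.

    query:       (batch, hidden)        current LSTM hidden state
    memory:      (batch, steps, hidden) previously stored hidden states
    persistence: (batch, steps)         hypothetical per-slot persistence counts
    """
    # Dot-product attention scores between the query and each memory slot.
    scores = torch.bmm(memory, query.unsqueeze(2)).squeeze(2)      # (batch, steps)
    # Weight scores in proportion to how long each slot's information persisted.
    scores = scores * persistence
    weights = torch.softmax(scores, dim=1)                         # (batch, steps)
    # Weighted sum over memory slots yields the retrieved context vector.
    context = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)   # (batch, hidden)
    return context, weights
```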
2018
Is it worth it? Budget-related evaluation metrics for model selection
Filip Klubička | Giancarlo D. Salton | John D. Kelleher
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)
2017
Attentive Language Models
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
In this paper, we extend Recurrent Neural Network Language Models (RNN-LMs) with an attention mechanism. We show that an “attentive” RNN-LM (with 11M parameters) achieves a better perplexity than larger RNN-LMs (with 66M parameters) and achieves performance comparable to an ensemble of 10 similar-sized RNN-LMs. We also show that an “attentive” RNN-LM needs less contextual information to achieve results similar to the state of the art on the WikiText-2 dataset.
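As a rough illustration of the architecture the abstract describes, here is a hedged PyTorch-style sketch of an RNN-LM that attends over its own earlier hidden states before predicting the next word; the class name, dimensions, and the choice to let each position attend to itself as well as the past are assumptions for brevity, not the paper's code.

```python
import torch
import torch.nn as nn


class AttentiveRNNLM(nn.Module):
    """Toy RNN language model with attention over its own past hidden states."""

    def __init__(self, vocab_size, emb_dim=256, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Combine the current state with its attention context before the softmax layer.
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        states, _ = self.lstm(self.embed(tokens))             # (batch, seq, hidden)
        # Dot-product attention of each position over itself and all earlier positions.
        scores = torch.bmm(states, states.transpose(1, 2))    # (batch, seq, seq)
        future = torch.triu(torch.ones_like(scores), diagonal=1).bool()
        scores = scores.masked_fill(future, float("-inf"))    # block attention to the future
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, states)                  # (batch, seq, hidden)
        return self.out(torch.cat([states, context], dim=-1)) # (batch, seq, vocab)
```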
Idiom Type Identification with Smoothed Lexical Features and a Maximum Margin Classifier
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the International Conference Recent Advances in Natural Language Processing, RANLP 2017
In our work we address limitations in the state of the art in idiom type identification. We investigate different approaches to a lexical fixedness metric, a component of the state-of-the-art model. We also show that our Machine Learning-based approach to the idiom type identification task achieves an F1-score of 0.85, an improvement of 11 points over the state of the art.
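A hedged sketch of the modelling setup the abstract describes, assuming scikit-learn's LinearSVC as the maximum-margin classifier over pre-computed lexical feature vectors; the randomly generated features below are placeholders, not the paper's smoothed lexical fixedness features.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Placeholder features: in the paper these would be smoothed lexical
# (fixedness-style) features computed for each candidate expression.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # 200 expressions, 8 lexical features each
y = rng.integers(0, 2, size=200)     # 1 = idiomatic type, 0 = literal type

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Maximum-margin (linear SVM) classifier over the lexical feature vectors.
clf = LinearSVC(C=1.0)
clf.fit(X_train, y_train)
print("F1:", f1_score(y_test, clf.predict(X_test)))
```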
2016
Idiom Token Classification using Sentential Distributed Semantics
Giancarlo Salton | Robert Ross | John Kelleher
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)