NNBlocks: A Deep Learning Framework for Computational Linguistics Neural Network Models

Frederico Tommasi Caroli, André Freitas, João Carlos Pereira da Silva, Siegfried Handschuh


Abstract
With the recent success of Deep Learning techniques in several computational linguistics tasks, many researchers want to explore new models for their linguistic applications. These models tend to differ considerably from standard Neural Network architectures, which limits the usefulness of standard Neural Network frameworks. This work presents NNBlocks, a new framework written in Python for building and training Neural Networks that are not constrained to a specific kind of architecture, making it suitable for computational linguistics applications.
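The abstract describes assembling networks from reusable building blocks instead of committing to one fixed architecture. As a rough illustration only (a hypothetical sketch in plain Python/numpy, not the actual NNBlocks API), such a block-composition style might look like this:

import numpy as np

class Block:
    """A unit of computation that can be chained with other blocks (hypothetical sketch)."""
    def forward(self, x):
        raise NotImplementedError

class Linear(Block):
    """Affine transformation block."""
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_in, n_out))
        self.b = np.zeros(n_out)
    def forward(self, x):
        return x @ self.W + self.b

class Tanh(Block):
    """Element-wise nonlinearity block."""
    def forward(self, x):
        return np.tanh(x)

class Chain(Block):
    """Composes blocks into an arbitrary pipeline rather than a fixed architecture."""
    def __init__(self, *blocks):
        self.blocks = blocks
    def forward(self, x):
        for block in self.blocks:
            x = block.forward(x)
        return x

# Example: a small feed-forward pipeline assembled from reusable blocks.
model = Chain(Linear(4, 8), Tanh(), Linear(8, 2))
print(model.forward(np.ones((1, 4))).shape)  # prints (1, 2)

The point of the sketch is only that arbitrary pipelines can be built from small, reusable units, which is the flexibility the abstract claims is needed for computational linguistics models.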
Anthology ID:
L16-1330
Volume:
Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC'16)
Month:
May
Year:
2016
Address:
Portorož, Slovenia
Editors:
Nicoletta Calzolari, Khalid Choukri, Thierry Declerck, Sara Goggi, Marko Grobelnik, Bente Maegaard, Joseph Mariani, Helene Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association (ELRA)
Pages:
2081–2085
URL:
https://aclanthology.org/L16-1330
Cite (ACL):
Frederico Tommasi Caroli, André Freitas, João Carlos Pereira da Silva, and Siegfried Handschuh. 2016. NNBlocks: A Deep Learning Framework for Computational Linguistics Neural Network Models. In Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC'16), pages 2081–2085, Portorož, Slovenia. European Language Resources Association (ELRA).
Cite (Informal):
NNBlocks: A Deep Learning Framework for Computational Linguistics Neural Network Models (Caroli et al., LREC 2016)
PDF:
https://aclanthology.org/L16-1330.pdf