Reproducing and Regularizing the SCRN Model

Olzhas Kabdolov, Zhenisbek Assylbekov, Rustem Takhanov

Abstract
We reproduce the Structurally Constrained Recurrent Network (SCRN) model and then regularize it using existing widespread techniques such as naive dropout, variational dropout, and weight tying. We show that, when regularized and optimized appropriately, the SCRN model can achieve performance comparable with that of the ubiquitous LSTM model on the language modeling task on English data, while outperforming it on non-English data.
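To make the abstract's ingredients concrete, below is a minimal PyTorch sketch of an SCRN language model with the two regularizers named above: weight tying between the input embedding and the softmax decoder, and variational dropout (one mask sampled per sequence and reused at every time step, unlike naive dropout, which resamples a mask at each step). This is not the authors' implementation (see zh3nis/scrn for that); the cell equations follow Mikolov et al.'s original SCRN formulation, the decoder reads only the fast hidden state for simplicity (the full model may also feed the slow context state into the softmax), and all sizes and the decay alpha are illustrative assumptions.

import torch
import torch.nn as nn


class SCRNCell(nn.Module):
    """Structurally Constrained Recurrent Network cell (Mikolov et al., 2014).

    Slow context state:  s_t = (1 - alpha) * B x_t + alpha * s_{t-1}
    Fast hidden state:   h_t = sigmoid(P s_t + A x_t + R h_{t-1})
    """

    def __init__(self, input_size, hidden_size, context_size, alpha=0.95):
        super().__init__()
        self.alpha = alpha  # fixed decay keeps the context state slowly changing
        self.B = nn.Linear(input_size, context_size, bias=False)
        self.A = nn.Linear(input_size, hidden_size, bias=False)
        self.P = nn.Linear(context_size, hidden_size, bias=False)
        self.R = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x, state):
        h, s = state
        s = (1.0 - self.alpha) * self.B(x) + self.alpha * s
        h = torch.sigmoid(self.P(s) + self.A(x) + self.R(h))
        return h, s


class SCRNLanguageModel(nn.Module):
    """Word-level LM with tied embedding/softmax and variational dropout on h."""

    def __init__(self, vocab_size, emb_size, hidden_size, context_size,
                 dropout=0.5):
        super().__init__()
        # Tying the decoder to the embedding requires matching dimensions.
        assert emb_size == hidden_size, "weight tying needs emb_size == hidden_size"
        self.embed = nn.Embedding(vocab_size, emb_size)
        self.cell = SCRNCell(emb_size, hidden_size, context_size)
        self.decoder = nn.Linear(hidden_size, vocab_size)
        self.decoder.weight = self.embed.weight  # weight tying
        self.dropout = dropout
        self.hidden_size = hidden_size
        self.context_size = context_size

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        batch, seq_len = tokens.shape
        h = torch.zeros(batch, self.hidden_size, device=tokens.device)
        s = torch.zeros(batch, self.context_size, device=tokens.device)
        # Variational dropout: sample ONE mask per sequence, reuse at every step.
        mask = None
        if self.training and self.dropout > 0:
            keep = 1.0 - self.dropout
            mask = torch.bernoulli(
                torch.full((batch, self.hidden_size), keep,
                           device=tokens.device)) / keep
        emb = self.embed(tokens)
        logits = []
        for t in range(seq_len):
            h, s = self.cell(emb[:, t], (h, s))
            out = h if mask is None else h * mask
            logits.append(self.decoder(out))
        return torch.stack(logits, dim=1)  # (batch, seq_len, vocab_size)

As a usage sketch, a WikiText-2-sized model could be built with, e.g., SCRNLanguageModel(vocab_size=33278, emb_size=256, hidden_size=256, context_size=64) and trained with cross-entropy over the returned logits; the concrete hyperparameters here are assumptions, not the paper's reported settings.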
Anthology ID: C18-1145
Volume: Proceedings of the 27th International Conference on Computational Linguistics
Month: August
Year: 2018
Address: Santa Fe, New Mexico, USA
Editors: Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue: COLING
Publisher: Association for Computational Linguistics
Pages: 1705–1716
URL: https://aclanthology.org/C18-1145
Cite (ACL): Olzhas Kabdolov, Zhenisbek Assylbekov, and Rustem Takhanov. 2018. Reproducing and Regularizing the SCRN Model. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1705–1716, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal): Reproducing and Regularizing the SCRN Model (Kabdolov et al., COLING 2018)
PDF: https://preview.aclanthology.org/naacl24-info/C18-1145.pdf
Code: zh3nis/scrn
Data: WikiText-2