Recurrent Neural Networks with Mixed Hierarchical Structures and EM Algorithm for Natural Language Processing

Zhaoxin Luo, Michael Zhu


Abstract
How to obtain hierarchical representations with increasing levels of abstraction has become one of the key issues in learning with deep neural networks. A variety of RNN models have recently been proposed in the literature to incorporate both explicit and implicit hierarchical information when modeling language. In this paper, we propose a novel approach, the latent indicator layer, to identify and learn implicit hierarchical information (e.g., phrases), and we further develop an EM algorithm to handle the latent indicator layer during training. The latent indicator layer also simplifies a text's hierarchical structure, which allows us to seamlessly integrate attention mechanisms at different levels of the structure. We call the resulting architecture the EM-HRNN model. Furthermore, we develop two bootstrap strategies to train the EM-HRNN model effectively and efficiently on long text documents. Simulation studies and real-data applications demonstrate that the EM-HRNN model with bootstrap training outperforms other RNN-based models in document classification tasks. Its performance is comparable to that of a Transformer-based method, Bert-base, even though the EM-HRNN model is much smaller and does not require pre-training.
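The abstract describes the architecture only at a high level. The following minimal PyTorch sketch is an illustration of the general idea, not the authors' implementation: a word-level RNN whose hidden states are softly segmented by a latent phrase-boundary indicator, with an EM-style alternation between estimating indicator posteriors (E-step) and updating parameters with those posteriors held fixed (M-step). The names (LatentIndicatorRNN, em_step), the sigmoid gate, and the soft relaxation of the binary indicators are all assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentIndicatorRNN(nn.Module):
    """Hypothetical sketch: word-level GRU + latent phrase-boundary
    indicator z_t + phrase-level GRU, for document classification."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128, n_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Gate produces q_t ~ P(z_t = 1 | h_t): the (approximate)
        # posterior that position t closes a phrase.
        self.gate = nn.Linear(hid_dim, 1)
        self.phrase_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)
        self.cls = nn.Linear(hid_dim, n_classes)

    def forward(self, tokens, q=None):
        h, _ = self.word_rnn(self.emb(tokens))   # (B, T, H)
        q_hat = torch.sigmoid(self.gate(h))      # (B, T, 1)
        if q is None:                            # E-step pass
            q = q_hat
        # Soft segmentation: weight hidden states by boundary
        # responsibilities instead of hard 0/1 indicators, so the
        # M-step stays differentiable.
        p, _ = self.phrase_rnn(q * h)
        return self.cls(p[:, -1]), q_hat

def em_step(model, opt, tokens, labels, loss_fn=nn.CrossEntropyLoss()):
    # E-step: indicator posteriors under current parameters, computed
    # without gradients so the M-step treats them as fixed.
    with torch.no_grad():
        _, q = model(tokens)
    # M-step: one gradient step on the classification loss, plus a
    # term fitting the gate to the E-step responsibilities (i.e.,
    # maximizing E_q[log p(z | h)] via binary cross-entropy).
    logits, q_hat = model(tokens, q=q)
    loss = loss_fn(logits, labels) + F.binary_cross_entropy(q_hat, q)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with random data, purely to show the training call.
model = LatentIndicatorRNN(vocab_size=10_000)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
tokens = torch.randint(0, 10_000, (8, 50))  # batch of 8 docs, 50 tokens
labels = torch.randint(0, 2, (8,))
print(em_step(model, opt, tokens, labels))
```

Holding the responsibilities q fixed during the M-step, rather than backpropagating through the gate end-to-end, is what makes the loop EM-like; the paper's exact E- and M-steps may differ from this relaxation.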
Anthology ID: 2022.lrec-1.656
Volume: Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month: June
Year: 2022
Address: Marseille, France
Editors: Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue: LREC
Publisher: European Language Resources Association
Pages: 6104–6113
URL: https://aclanthology.org/2022.lrec-1.656
Cite (ACL): Zhaoxin Luo and Michael Zhu. 2022. Recurrent Neural Networks with Mixed Hierarchical Structures and EM Algorithm for Natural Language Processing. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 6104–6113, Marseille, France. European Language Resources Association.
Cite (Informal): Recurrent Neural Networks with Mixed Hierarchical Structures and EM Algorithm for Natural Language Processing (Luo & Zhu, LREC 2022)
PDF: https://preview.aclanthology.org/nschneid-patch-1/2022.lrec-1.656.pdf