@inproceedings{liza-grzes-2019-relating,
    title = "Relating {RNN} Layers with the Spectral {WFA} Ranks in Sequence Modelling",
    author = "Liza, Farhana Ferdousi  and
      Grzes, Marek",
    editor = "Eisner, Jason  and
      Gall{\'e}, Matthias  and
      Heinz, Jeffrey  and
      Quattoni, Ariadna  and
      Rabusseau, Guillaume",
    booktitle = "Proceedings of the Workshop on Deep Learning and Formal Languages: Building Bridges",
    month = aug,
    year = "2019",
    address = "Florence",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/W19-3903/",
    doi = "10.18653/v1/W19-3903",
    pages = "24--33",
    abstract = "We analyse Recurrent Neural Networks (RNNs) to understand the significance of multiple LSTM layers. We argue that the Weighted Finite-state Automata (WFA) trained using a spectral learning algorithm are helpful to analyse RNNs. Our results suggest that multiple LSTM layers in RNNs help learning distributed hidden states, but have a smaller impact on the ability to learn long-term dependencies. The analysis is based on the empirical results, however relevant theory (whenever possible) was discussed to justify and support our conclusions."
}

Markdown (Informal)
[Relating RNN Layers with the Spectral WFA Ranks in Sequence Modelling](https://preview.aclanthology.org/ingest-emnlp/W19-3903/) (Liza & Grzes, ACL 2019)