Abstract
Modeling sequence data with probabilistic finite state machines (PFSMs) is a technique for analyzing the underlying dynamics in sequences of symbols. Hidden semi-Markov models (HSMMs) and hierarchical hidden Markov models (HHMMs) are PFSMs that extend HMMs to make the extracted patterns easier to interpret, and they have been successfully applied to a wide variety of applications. However, each of these models was developed independently with its own training algorithm, so multiple kinds of structure cannot be combined when building a PFSM for a specific application. In this paper, we prove that silent hidden Markov models (silent HMMs) are flexible models with more expressive power than HSMMs and HHMMs. Silent HMMs are HMMs that contain silent states, which do not emit any observations. We show that for any given HSMM or HHMM we can obtain an equivalent silent HMM. We believe that these results form a firm foundation for using silent HMMs as a unified representation for PFSM modeling.
- Anthology ID: W19-3113
- Volume: Proceedings of the 14th International Conference on Finite-State Methods and Natural Language Processing
- Month: September
- Year: 2019
- Address: Dresden, Germany
- Editors: Heiko Vogler, Andreas Maletti
- Venue: FSMNLP
- SIG: SIGFSM
- Publisher: Association for Computational Linguistics
- Pages: 98–107
- URL: https://aclanthology.org/W19-3113
- DOI: 10.18653/v1/W19-3113
- Cite (ACL): Kei Wakabayashi. 2019. Silent HMMs: Generalized Representation of Hidden Semi-Markov Models and Hierarchical HMMs. In Proceedings of the 14th International Conference on Finite-State Methods and Natural Language Processing, pages 98–107, Dresden, Germany. Association for Computational Linguistics.
- Cite (Informal): Silent HMMs: Generalized Representation of Hidden Semi-Markov Models and Hierarchical HMMs (Wakabayashi, FSMNLP 2019)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/W19-3113.pdf
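To make the abstract's notion of a silent state concrete, here is a minimal sketch of a forward-algorithm variant for an HMM containing silent states. This is not the paper's construction or notation: the function name, matrix layout, and the assumption that silent states are topologically ordered with no silent cycles are all choices made for this illustration. Silent states emit nothing, so their probability mass is immediately passed along their outgoing transitions.

```python
import numpy as np

def forward_silent(A, B, pi, silent, obs):
    """Total probability of `obs` under an HMM that may contain silent states.

    A: (N, N) row-stochastic transition matrix.
    B: (N, M) emission matrix (rows for silent states are ignored).
    pi: (N,) initial state distribution.
    silent: (N,) boolean mask marking silent (non-emitting) states.
    obs: sequence of observation symbol indices.
    """
    N = A.shape[0]

    def settle(alpha):
        # Flow probability mass out of silent states along their outgoing
        # transitions. Assumes silent states are topologically ordered by
        # index and have no silent self-loops or silent cycles.
        alpha = alpha.copy()
        for s in range(N):
            if silent[s] and alpha[s] > 0.0:
                mass, alpha[s] = alpha[s], 0.0
                alpha += mass * A[s]
        return alpha

    alpha = settle(np.asarray(pi, dtype=float))
    for o in obs:
        # Emit from the non-silent states, transition, then settle the
        # mass that landed on silent states.
        alpha = settle(A.T @ (alpha * np.where(silent, 0.0, B[:, o])))
    return alpha.sum()
```

With an empty `silent` mask this reduces to the ordinary forward algorithm; the `settle` step is the only addition needed to accommodate non-emitting states.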