Tatsuya Watanabe


2014

Representing Multimodal Linguistic Annotated Data
Brigitte Bigi | Tatsuya Watanabe | Laurent Prévot
Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC'14)

The question of interoperability for annotated linguistic resources covers several aspects. First, it requires a representation framework that makes it possible to compare, and possibly merge, different annotation schemes. In this paper, a general description level for representing multimodal linguistic annotations is proposed. It focuses on the representation of time and of the data content: the paper reconsiders and enhances the current generalized representation of annotations. An XML schema for such annotations is proposed, together with a Python API. This framework is implemented in multi-platform software distributed under the terms of the GNU General Public License.
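The abstract mentions a time-anchored annotation representation, an XML schema, and a Python API, but gives no further detail here. As a rough illustration only, the sketch below shows one plausible way such annotations could be modelled and exported to XML in Python; the class names (Annotation, Tier) and the XML layout are hypothetical and are not taken from the paper or its accompanying software.

```python
# A minimal sketch (hypothetical, not the paper's actual API): time-anchored
# annotations grouped into named tiers, with a simple XML export.
from dataclasses import dataclass, field
from typing import List
import xml.etree.ElementTree as ET


@dataclass
class Annotation:
    begin: float   # start time in seconds
    end: float     # end time in seconds
    label: str     # annotated content (e.g. a token or a gesture tag)


@dataclass
class Tier:
    name: str                                    # e.g. "tokens", "gestures"
    annotations: List[Annotation] = field(default_factory=list)

    def to_xml(self) -> ET.Element:
        """Serialize the tier and its annotations to a simple XML element."""
        tier_el = ET.Element("Tier", name=self.name)
        for ann in self.annotations:
            ann_el = ET.SubElement(tier_el, "Annotation",
                                   begin=str(ann.begin), end=str(ann.end))
            ann_el.text = ann.label
        return tier_el


# Example: a speech tier whose annotations share one timeline with other
# (e.g. gesture) tiers of the same recording.
speech = Tier("tokens", [Annotation(0.00, 0.42, "hello"),
                         Annotation(0.42, 0.80, "world")])
print(ET.tostring(speech.to_xml(), encoding="unicode"))
```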

Extracting multi-annotated speech data (Extraction de données orales multi-annotées) [in French]
Brigitte Bigi | Tatsuya Watanabe
Proceedings of TALN 2014 (Volume 2: Short Papers)