Joseph Marvin Imperial


2022

A Baseline Readability Model for Cebuano
Joseph Marvin Imperial | Lloyd Lois Antonie Reyes | Michael Antonio Ibanez | Ranz Sapinit | Mohammed Hussien
Proceedings of the 17th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2022)

In this study, we developed the first baseline readability model for the Cebuano language. Cebuano is the second most-used native language in the Philippines, with about 27.5 million speakers. As the baseline, we extracted traditional or surface-based features, syllable patterns based on Cebuano’s documented orthography, and neural embeddings from the multilingual BERT model. Results show that the first two sets of handcrafted linguistic features achieved the best performance when used to train an optimized Random Forest model, reaching approximately 87% across all metrics. The feature sets and algorithm used are also similar to those in previous work on readability assessment for the Filipino language, showing potential for crosslingual application. To encourage more work on readability assessment in Philippine languages such as Cebuano, we open-sourced both code and data.

NU HLT at CMCL 2022 Shared Task: Multilingual and Crosslingual Prediction of Human Reading Behavior in Universal Language Space
Joseph Marvin Imperial
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics

In this paper, we present a unified model that works for both multilingual and crosslingual prediction of the reading times of words in various languages. The key to this model’s success lies in the preprocessing step, where all words are transformed into their universal language representation via the International Phonetic Alphabet (IPA). To the best of our knowledge, this is the first study to favorably exploit this phonological property of language for the two tasks. Various feature types were extracted for model training, covering basic frequencies, n-grams, and information-theoretic and psycholinguistically motivated predictors. A finetuned Random Forest model obtained the best performance on both tasks, with MAE scores of 3.8031 and 3.9065 for mean first fixation duration (FFDAvg) and mean total reading time (TRTAvg), respectively.

2021

Under the Microscope: Interpreting Readability Assessment Models for Filipino
Joseph Marvin Imperial | Ethel Ong
Proceedings of the 35th Pacific Asia Conference on Language, Information and Computation

Science Mapping of Publications in Natural Language Processing in the Philippines: 2006 to 2020
Rachel Edita O. Roxas | Joseph Marvin Imperial | Angelica H. De La Cruz
Proceedings of the 35th Pacific Asia Conference on Language, Information and Computation

BERT Embeddings for Automatic Readability Assessment
Joseph Marvin Imperial
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)

Automatic readability assessment (ARA) is the task of evaluating the level of ease or difficulty of text documents for a target audience. One of the many open problems in the field is making models trained for the task effective even for low-resource languages. In this study, we propose an alternative way of utilizing the information-rich embeddings of BERT models together with handcrafted linguistic features through a combined method for readability assessment. Results show that the proposed method outperforms classical approaches to readability assessment on English and Filipino datasets, obtaining up to a 12.4% increase in F1 performance. We also show that the general information encoded in BERT embeddings can serve as a substitute feature set for low-resource languages like Filipino, which have limited semantic and syntactic NLP tools for explicitly extracting feature values for the task.

2020

A Simple Disaster-Related Knowledge Base for Intelligent Agents
Clark Emmanuel Paulo | Arvin Ken Ramirez | David Clarence Reducindo | Rannie Mark Mateo | Joseph Marvin Imperial
Proceedings of the 34th Pacific Asia Conference on Language, Information and Computation