Alec Marantz


2023

BERT Shows Garden Path Effects
Tovah Irwin | Kyra Wilson | Alec Marantz
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics

Garden path sentences (e.g., “The horse raced past the barn fell”) are sentences that readers initially parse incorrectly, requiring partial or total re-analysis of the sentence structure. Given human difficulty in parsing garden paths, we compare transformer language models’ performance on these sentences to human performance. We assess a selection of models from the BERT family that have been fine-tuned on the question-answering task, and evaluate each model’s performance on comprehension questions based on garden path and control sentences. We then further investigate the semantic roles assigned to arguments of verbs in garden path and control sentences, using a probe task to directly assess which semantic role(s) each model assigns. We find that the models perform relatively poorly on certain question-answering instances based on garden path contexts and incorrectly assign semantic roles, aligning for the most part with human performance.

2022

Contextual Embeddings Can Distinguish Homonymy from Polysemy in a Human-Like Way
Kyra Wilson | Alec Marantz
Proceedings of the 5th International Conference on Natural Language and Speech Processing (ICNLSP 2022)

2020

Modeling morphological processing in human magnetoencephalography
Yohei Oseki | Alec Marantz
Proceedings of the Society for Computation in Linguistics 2020

2019

Modeling Hierarchical Syntactic Structures in Morphological Processing
Yohei Oseki | Charles Yang | Alec Marantz
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics

Sentences are represented as hierarchical syntactic structures, which have been successfully modeled in sentence processing. In contrast, despite theoretical agreement that hierarchical syntactic structures also exist within words, words have been argued to be computationally less complex than sentences and have been implemented by finite-state models as linear strings of morphemes; the psychological reality of morphemes itself has even been denied. In this paper, extending the computational models employed in sentence processing to morphological processing, we performed a simulation experiment in which, with incremental surprisal as the linking hypothesis, five computational models with different representational assumptions were evaluated against human reaction times in visual lexical decision experiments available from the English Lexicon Project (ELP), a “shared task” in the morphological processing literature. The simulation experiment demonstrated that (i) “amorphous” models without morpheme units underperformed relative to “morphous” models, (ii) a computational model with hierarchical syntactic structures, Probabilistic Context-Free Grammar (PCFG), most accurately explained human reaction times, and (iii) this performance was achieved on top of surface frequency effects. These results strongly suggest that morphological processing tracks morphemes incrementally from left to right and parses them into hierarchical syntactic structures, contrary to “amorphous” and finite-state models of morphological processing.
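The surprisal linking hypothesis in this abstract can be illustrated with a toy sketch (not the paper's implementation): incremental surprisal is -log2 P(unit | left context), computed morpheme by morpheme. Here a smoothed bigram model over a tiny hypothetical morpheme-segmented corpus stands in for the paper's probabilistic models; the corpus, function names, and smoothing choice are all illustrative assumptions.

```python
import math
from collections import Counter, defaultdict

# Hypothetical morpheme-segmented corpus (illustrative only).
corpus = [
    ["kind", "ness"],
    ["dark", "ness"],
    ["kind", "ly"],
    ["dark", "ly"],
    ["re", "do"],
]

# Count bigram transitions, with "<s>" as the start-of-word context.
bigrams = defaultdict(Counter)
for word in corpus:
    context = "<s>"
    for morpheme in word:
        bigrams[context][morpheme] += 1
        context = morpheme

VOCAB = {m for w in corpus for m in w}

def incremental_surprisal(word):
    """Left-to-right surprisal, -log2 P(morpheme | previous morpheme),
    with add-one smoothing over the observed morpheme vocabulary."""
    surprisals = []
    context = "<s>"
    for morpheme in word:
        counts = bigrams[context]
        p = (counts[morpheme] + 1) / (sum(counts.values()) + len(VOCAB))
        surprisals.append(-math.log2(p))
        context = morpheme
    return surprisals

# An attested continuation ("kind"+"ness") is less surprising than an
# unattested one ("kind"+"do") in the same position.
print(incremental_surprisal(["kind", "ness"]))
print(incremental_surprisal(["kind", "do"]))
```

Under the linking hypothesis, these per-morpheme surprisals would then be regressed against reaction times; the representational question is which model supplies the probabilities, with a PCFG replacing the flat bigram used here.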

Inverting and Modeling Morphological Inflection
Yohei Oseki | Yasutada Sudo | Hiromu Sakai | Alec Marantz
Proceedings of the 16th Workshop on Computational Research in Phonetics, Phonology, and Morphology

Previous “wug” tests (Berko, 1958) on Japanese verbal inflection have demonstrated that Japanese speakers, both adults and children, cannot inflect novel present tense forms into the “correct” past tense forms predicted by rules of existent verbs (de Chene, 1982; Vance, 1987, 1991; Klafehn, 2003, 2013), indicating that Japanese verbs are merely stored in the mental lexicon. However, the implicit assumption that present tense forms are the bases for verbal inflection should not be blindly extended to morphologically rich languages like Japanese, in which both present and past tense forms are morphologically complex, with no inherent direction of derivation (Albright, 2002). Interestingly, there are also independent observations in the acquisition literature suggesting that past tense forms may be the bases for verbal inflection in Japanese (Klafehn, 2003; Murasugi et al., 2010; Hirose, 2017; Tatsumi et al., 2018). In this paper, we computationally simulate both directions of verbal inflection in Japanese, Present → Past and Past → Present, with the rule-based computational model called the Minimal Generalization Learner (MGL; Albright and Hayes, 2003), and experimentally evaluate the model with a bidirectional “wug” test in which humans inflect novel verbs in the two opposite directions. We conclude that Japanese verbs can be computed online via generalizations, and that those generalizations depend on the direction of morphological inflection.
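The flavor of direction-sensitive rule induction described above can be caricatured in a few lines. This is a drastic simplification, not the actual MGL algorithm (which generalizes over phonological features and scores rules by confidence-adjusted reliability): each (base, derived) pair yields a suffix-change rule, and the most frequent matching rule is applied to a novel “wug” verb. The verb forms and glosses are illustrative assumptions, not the paper's stimuli.

```python
from collections import Counter

# Toy Present -> Past pairs (illustrative, not the paper's data).
pairs = [
    ("taberu", "tabeta"),  # 'eat'
    ("akeru", "aketa"),    # 'open'
    ("miru", "mita"),      # 'see'
    ("nomu", "nonda"),     # 'drink'
    ("yomu", "yonda"),     # 'read'
]

def change_rule(src, dst):
    """Strip the longest shared prefix; the residues form an A -> B rule."""
    i = 0
    while i < min(len(src), len(dst)) and src[i] == dst[i]:
        i += 1
    return src[i:], dst[i:]

# Induce rules and count how often each one recurs across pairs.
rules = Counter(change_rule(s, d) for s, d in pairs)

def inflect(novel):
    """Apply the most frequently attested matching rule to a novel form."""
    for (a, b), _count in rules.most_common():
        if novel.endswith(a):
            return novel[: len(novel) - len(a)] + b
    return novel  # no rule matches: leave unchanged

print(inflect("kaberu"))  # hypothetical wug verb -> "kabeta"
```

Running the same induction on the reversed pairs gives Past → Present rules, which is the sense in which the generalizations depend on the direction of inflection; the real MGL additionally weighs each rule's scope against its exceptions.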

2018

Phonological (un)certainty weights lexical activation
Laura Gwilliams | David Poeppel | Alec Marantz | Tal Linzen
Proceedings of the 8th Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2018)