@inproceedings{kamali-kordjamshidi-2023-syntax,
    title = "Syntax-Guided Transformers: Elevating Compositional Generalization and Grounding in Multimodal Environments",
    author = "Kamali, Danial  and
      Kordjamshidi, Parisa",
    editor = "Hupkes, Dieuwke  and
      Dankers, Verna  and
      Batsuren, Khuyagbaatar  and
      Sinha, Koustuv  and
      Kazemnejad, Amirhossein  and
      Christodoulopoulos, Christos  and
      Cotterell, Ryan  and
      Bruni, Elia",
    booktitle = "Proceedings of the 1st GenBench Workshop on (Benchmarking) Generalisation in NLP",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.genbench-1.10/",
    doi = "10.18653/v1/2023.genbench-1.10",
    pages = "130--142",
    abstract = "Compositional generalization, the ability of intelligent models to extrapolate understanding of components to novel compositions, is a fundamental yet challenging facet in AI research, especially within multimodal environments. In this work, we address this challenge by exploiting the syntactic structure of language to boost compositional generalization. This paper elevates the importance of syntactic grounding, particularly through attention masking techniques derived from text input parsing. We introduce and evaluate the merits of using syntactic information in the multimodal grounding problem. Our results on grounded compositional generalization underscore the positive impact of dependency parsing across diverse tasks when utilized with Weight Sharing across the Transformer encoder. The results push the state-of-the-art in multimodal grounding and parameter-efficient modeling and provide insights for future research."
}