Abstract
Structured data, prevalent in tables, databases, and knowledge graphs, poses a significant representation challenge. With the advent of large language models (LLMs), there has been a shift towards linearization-based methods, which process structured data as sequential token streams, diverging from approaches that explicitly model structure, often as a graph. Crucially, there remains a gap in our understanding of how these linearization-based methods handle structured data, which is inherently non-linear. This work investigates the linear handling of structured data in encoder-decoder language models, specifically T5. Our findings reveal the model’s ability to mimic human-designed processes such as schema linking and syntax prediction, indicating a deep, meaningful learning of structure beyond simple token sequencing. We also uncover insights into the model’s internal mechanisms, including the ego-centric nature of structure node encodings and the potential for model compression due to modality fusion redundancy. Overall, this work sheds light on the inner workings of linearization-based methods and could potentially provide guidance for future research.
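To make the linearization setting concrete, here is a minimal sketch of how a relational schema can be flattened into a sequential token stream for a T5-style text-to-SQL encoder. The `question | db_id | table : columns` serialization below is an assumption borrowed from common text-to-SQL formats, not the paper's exact specification, and the example schema is hypothetical.

```python
# Minimal sketch: flatten a question plus database schema into one input
# string for an encoder-decoder model such as T5. The serialization format
# is an assumed convention from prior text-to-SQL work, not the paper's spec.

def linearize(question: str, db_id: str, tables: dict[str, list[str]]) -> str:
    """Join the question, database id, and each table's columns with '|'."""
    parts = [question, db_id]
    for table, columns in tables.items():
        parts.append(f"{table} : " + ", ".join(columns))
    return " | ".join(parts)

if __name__ == "__main__":
    # Hypothetical schema for illustration only.
    schema = {
        "singer": ["singer_id", "name", "country"],
        "concert": ["concert_id", "singer_id", "year"],
    }
    print(linearize("How many singers are from France?", "concert_singer", schema))
    # How many singers are from France? | concert_singer |
    # singer : singer_id, name, country | concert : concert_id, singer_id, year
```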
- Anthology ID: 2024.naacl-long.8
- Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
- Month: June
- Year: 2024
- Address: Mexico City, Mexico
- Editors: Kevin Duh, Helena Gomez, Steven Bethard
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 131–156
- URL: https://preview.aclanthology.org/add_missing_videos/2024.naacl-long.8/
- DOI: 10.18653/v1/2024.naacl-long.8
- Cite (ACL): Yutong Shao and Ndapa Nakashole. 2024. On Linearizing Structured Data in Encoder-Decoder Language Models: Insights from Text-to-SQL. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 131–156, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal): On Linearizing Structured Data in Encoder-Decoder Language Models: Insights from Text-to-SQL (Shao & Nakashole, NAACL 2024)
- PDF: https://preview.aclanthology.org/add_missing_videos/2024.naacl-long.8.pdf