Martin Ringsquandl


2022

Named Entity Recognition in Industrial Tables using Tabular Language Models
Aneta Koleva | Martin Ringsquandl | Mark Buckley | Rakeb Hasan | Volker Tresp
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track

Specialized transformer-based models for encoding tabular data have gained interest in academia. Although tabular data is omnipresent in industry, applications of table transformers are still missing. In this paper, we study how these models can be applied to an industrial Named Entity Recognition (NER) problem where the entities are mentioned in tabular-structured spreadsheets. The highly technical nature of spreadsheets as well as the lack of labeled data present major challenges for fine-tuning transformer-based models. Therefore, we develop a dedicated table data augmentation strategy based on available domain-specific knowledge graphs. We show that this boosts performance in our low-resource scenario considerably. Further, we investigate the benefits of tabular structure as inductive bias compared to tables as linearized sequences. Our experiments confirm that a table transformer outperforms other baselines and that its tabular inductive bias is vital for convergence of transformer-based models.
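
The contrast drawn in the abstract between tabular structure as an inductive bias and tables as linearized sequences can be made concrete with a small sketch. The snippet below is a generic illustration of how a spreadsheet-like table might be flattened into one token sequence for a standard sequence model; the separator tokens, the header-value format, and the example rows are assumptions chosen for illustration, not the serialization or data used in the paper.

```python
# Minimal sketch of table linearization for a sequence model.
# The [SEP]/[ROW] separators and "header : value" format are
# illustrative assumptions, not the paper's serialization scheme.

from typing import List


def linearize_table(headers: List[str], rows: List[List[str]]) -> str:
    """Flatten a table into a single sequence, row by row."""
    parts = []
    for row in rows:
        cells = [f"{h} : {v}" for h, v in zip(headers, row)]
        parts.append(" [SEP] ".join(cells))
    return " [ROW] ".join(parts)


# Hypothetical industrial spreadsheet rows, for illustration only.
headers = ["Tag", "Description", "Unit"]
rows = [
    ["TT-101", "Reactor temperature sensor", "degC"],
    ["PI-205", "Inlet pressure indicator", "bar"],
]

print(linearize_table(headers, rows))
# Tag : TT-101 [SEP] Description : Reactor temperature sensor [SEP] Unit : degC [ROW] ...
```

A table transformer, by contrast, keeps the row and column coordinates of each cell (e.g. as additional positional embeddings) instead of discarding them during flattening, which is the structural inductive bias the paper evaluates.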