Towards a new research agenda for multimodal enterprise document understanding: What are we missing?

Armineh Nourbakhsh, Sameena Shah, Carolyn Rose


Abstract
The field of multimodal document understanding has produced a suite of models that have achieved stellar performance across several tasks, even coming close to human performance on certain benchmarks. Nevertheless, the application of these models to real-world enterprise datasets remains constrained by a number of limitations. In this position paper, we discuss these limitations in the context of three key aspects of research: dataset curation, model development, and evaluation on downstream tasks. By analyzing 14 datasets and 7 SotA models, we identify major gaps in their utility in the context of a real-world scenario. We demonstrate how each limitation impedes the widespread use of SotA models in enterprise settings, and present a set of research challenges that are motivated by these limitations. Lastly, we propose a research agenda that is aimed at driving the field towards higher impact in enterprise applications.
Anthology ID: 2024.findings-acl.870
Volume: Findings of the Association for Computational Linguistics ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand and virtual meeting
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 14610–14622
URL: https://aclanthology.org/2024.findings-acl.870
DOI: 10.18653/v1/2024.findings-acl.870
Cite (ACL):
Armineh Nourbakhsh, Sameena Shah, and Carolyn Rose. 2024. Towards a new research agenda for multimodal enterprise document understanding: What are we missing?. In Findings of the Association for Computational Linguistics ACL 2024, pages 14610–14622, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Towards a new research agenda for multimodal enterprise document understanding: What are we missing? (Nourbakhsh et al., Findings 2024)
PDF: https://preview.aclanthology.org/nschneid-patch-5/2024.findings-acl.870.pdf