Skim-Attention: Learning to Focus via Document Layout

Laura Nguyen, Thomas Scialom, Jacopo Staiano, Benjamin Piwowarski


Abstract
Transformer-based pre-training techniques for text and layout have proven effective in a number of document understanding tasks. Despite this success, multimodal pre-training models suffer from very high computational and memory costs. Motivated by human reading strategies, this paper presents Skim-Attention, a new attention mechanism that takes advantage of the structure of the document and its layout. Skim-Attention only attends to the 2-dimensional positions of the words in a document. Our experiments show that Skim-Attention obtains lower perplexity than prior works while being more computationally efficient. Skim-Attention can further be combined with long-range Transformers to efficiently process long documents. We also show how Skim-Attention can be used off-the-shelf as a mask for any pre-trained language model, improving performance while restricting attention. Finally, we show the emergence of a document structure representation in Skim-Attention.
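
The sketch below is a rough illustration of the mechanism described in the abstract: attention scores are computed purely from embeddings of the words' 2-D bounding-box positions, so the resulting attention pattern depends only on layout and can then be reused, e.g. as a soft mask over a text model's attention. This is a minimal sketch under assumed details, not the authors' released implementation; the class and variable names (SkimAttentionSketch, x_emb, y_emb, max_coord) are illustrative assumptions.

import torch
import torch.nn as nn

class SkimAttentionSketch(nn.Module):
    def __init__(self, hidden_size: int = 64, max_coord: int = 1000):
        super().__init__()
        # Separate embeddings for x and y bounding-box coordinates (assumed layout encoding).
        self.x_emb = nn.Embedding(max_coord + 1, hidden_size)
        self.y_emb = nn.Embedding(max_coord + 1, hidden_size)
        self.query = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)

    def forward(self, boxes: torch.Tensor) -> torch.Tensor:
        # boxes: (batch, seq_len, 4) integer coordinates (x0, y0, x1, y1) in [0, max_coord].
        pos = (
            self.x_emb(boxes[..., 0]) + self.y_emb(boxes[..., 1])
            + self.x_emb(boxes[..., 2]) + self.y_emb(boxes[..., 3])
        )
        q, k = self.query(pos), self.key(pos)
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
        # Attention is computed purely from layout; the tokens' text plays no role here.
        return scores.softmax(dim=-1)

# Usage: the layout-only attention can weight token states or serve as a mask
# for a separate text Transformer (hypothetical toy inputs below).
boxes = torch.randint(0, 1001, (1, 8, 4))      # 8 tokens with random bounding boxes
skim_probs = SkimAttentionSketch()(boxes)      # (1, 8, 8) layout-based attention
token_states = torch.randn(1, 8, 64)
skimmed = skim_probs @ token_states            # layout-guided aggregation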
Anthology ID:
2021.findings-emnlp.207
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2413–2427
URL:
https://aclanthology.org/2021.findings-emnlp.207
DOI:
10.18653/v1/2021.findings-emnlp.207
Cite (ACL):
Laura Nguyen, Thomas Scialom, Jacopo Staiano, and Benjamin Piwowarski. 2021. Skim-Attention: Learning to Focus via Document Layout. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2413–2427, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Skim-Attention: Learning to Focus via Document Layout (Nguyen et al., Findings 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.findings-emnlp.207.pdf
Software:
 2021.findings-emnlp.207.Software.zip
Video:
 https://preview.aclanthology.org/naacl-24-ws-corrections/2021.findings-emnlp.207.mp4
Code:
 recitalai/skim-attention