Beyond Paragraphs: NLP for Long Sequences
Iz Beltagy, Arman Cohan, Hannaneh Hajishirzi, Sewon Min, Matthew E. Peters
Abstract
In this tutorial, we aim to bring interested NLP researchers up to speed on recent and ongoing techniques for document-level representation learning. Additionally, our goal is to reveal new research opportunities to the audience, which will hopefully bring us closer to addressing existing challenges in this domain.

- Anthology ID: 2021.naacl-tutorials.5
- Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Tutorials
- Month: June
- Year: 2021
- Address: Online
- Editors: Greg Kondrak, Kalina Bontcheva, Dan Gillick
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 20–24
- URL: https://aclanthology.org/2021.naacl-tutorials.5
- DOI: 10.18653/v1/2021.naacl-tutorials.5
- Cite (ACL): Iz Beltagy, Arman Cohan, Hannaneh Hajishirzi, Sewon Min, and Matthew E. Peters. 2021. Beyond Paragraphs: NLP for Long Sequences. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Tutorials, pages 20–24, Online. Association for Computational Linguistics.
- Cite (Informal): Beyond Paragraphs: NLP for Long Sequences (Beltagy et al., NAACL 2021)
- PDF: https://preview.aclanthology.org/improve-issue-templates/2021.naacl-tutorials.5.pdf
- Code: allenai/naacl2021-longdoc-tutorial