Assessing the ability of Transformer-based Neural Models to represent structurally unbounded dependencies

Jillian Da Costa, Rui Chaves


Anthology ID: 2020.scil-1.2
Volume: Proceedings of the Society for Computation in Linguistics 2020
Month: January
Year: 2020
Address: New York, New York
Venue: SCiL
Publisher: Association for Computational Linguistics
Pages: 12–21
URL: https://aclanthology.org/2020.scil-1.2
Cite (ACL):
Jillian Da Costa and Rui Chaves. 2020. Assessing the ability of Transformer-based Neural Models to represent structurally unbounded dependencies. In Proceedings of the Society for Computation in Linguistics 2020, pages 12–21, New York, New York. Association for Computational Linguistics.
Cite (Informal):
Assessing the ability of Transformer-based Neural Models to represent structurally unbounded dependencies (Da Costa & Chaves, SCiL 2020)
PDF: https://aclanthology.org/2020.scil-1.2.pdf
Code: ruipchaves/transformers-fillergap-dependencies
Data: Billion Word Benchmark