A Gentle Introduction to Deep Nets and Opportunities for the Future
Kenneth Church, Valia Kordoni, Gary Marcus, Ernest Davis, Yanjun Ma, Zeyu Chen
Abstract
The first half of this tutorial will make deep nets more accessible to a broader audience, following “Deep Nets for Poets” and “A Gentle Introduction to Fine-Tuning.” We will also introduce GFT (general fine tuning), a little language for fine tuning deep nets with short (one line) programs that are as easy to code as regression in statistics packages such as R using glm (general linear models). Based on the success of these methods on a number of benchmarks, one might come away with the impression that deep nets are all we need. However, we believe the glass is half-full: while there is much that can be done with deep nets, there is always more to do. The second half of this tutorial will discuss some of these opportunities.
- Anthology ID:
- 2022.acl-tutorials.1
- Volume:
- Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts
- Month:
- May
- Year:
- 2022
- Address:
- Dublin, Ireland
- Editors:
- Luciana Benotti, Naoaki Okazaki, Yves Scherrer, Marcos Zampieri
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1–6
- URL:
- https://aclanthology.org/2022.acl-tutorials.1
- DOI:
- 10.18653/v1/2022.acl-tutorials.1
- Cite (ACL):
- Kenneth Church, Valia Kordoni, Gary Marcus, Ernest Davis, Yanjun Ma, and Zeyu Chen. 2022. A Gentle Introduction to Deep Nets and Opportunities for the Future. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, pages 1–6, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal):
- A Gentle Introduction to Deep Nets and Opportunities for the Future (Church et al., ACL 2022)
- PDF:
- https://preview.aclanthology.org/add_acl24_videos/2022.acl-tutorials.1.pdf
- Data
- GLUE