Abstract
There is a large performance gap between formal and informal language understanding tasks. Recent pre-trained models that improved formal language understanding have not achieved comparable results on informal language. We propose a data annealing transfer learning procedure to bridge this performance gap on informal natural language understanding tasks, allowing a pre-trained model such as BERT to be used effectively on informal language. In the data annealing procedure, the training set consists mainly of formal text data at first; the proportion of informal text data is then gradually increased during training. Our data annealing procedure is model-independent and can be applied to various tasks, and we validate its effectiveness in extensive experiments. When BERT is trained with our learning procedure, it outperforms all the state-of-the-art models on three common informal language tasks.
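One way to realize such a schedule is to anneal the fraction of informal examples in each mini-batch from low to high over the course of training. The sketch below is a minimal illustration of this idea, assuming a linear schedule and hypothetical `formal_data` / `informal_data` lists; the paper's actual schedule and mixing details are not reproduced here.

```python
import random

def informal_fraction(step, total_steps, start=0.1, end=0.9):
    """Linearly anneal the share of informal examples per batch.

    The linear form and the start/end values are illustrative choices,
    not the paper's exact hyperparameters.
    """
    t = min(step / max(total_steps, 1), 1.0)
    return start + t * (end - start)

def sample_batch(formal_data, informal_data, batch_size, step, total_steps):
    """Draw a mixed batch whose informal share grows as training proceeds."""
    frac = informal_fraction(step, total_steps)
    n_informal = int(round(batch_size * frac))
    n_formal = batch_size - n_informal
    batch = (random.sample(informal_data, n_informal)
             + random.sample(formal_data, n_formal))
    random.shuffle(batch)
    return batch

# Usage: each batch is fed to a pre-trained model (e.g. BERT) exactly as in
# ordinary fine-tuning; only the formal/informal data mixture changes.
# for step in range(total_steps):
#     batch = sample_batch(formal_data, informal_data, 32, step, total_steps)
#     loss = model.train_step(batch)
```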
- Anthology ID: 2020.findings-emnlp.282
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
- Month: November
- Year: 2020
- Address: Online
- Editors: Trevor Cohn, Yulan He, Yang Liu
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 3153–3159
- URL: https://aclanthology.org/2020.findings-emnlp.282
- DOI: 10.18653/v1/2020.findings-emnlp.282
- Cite (ACL): Jing Gu and Zhou Yu. 2020. Data Annealing for Informal Language Understanding Tasks. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3153–3159, Online. Association for Computational Linguistics.
- Cite (Informal): Data Annealing for Informal Language Understanding Tasks (Gu & Yu, Findings 2020)
- PDF: https://preview.aclanthology.org/naacl24-info/2020.findings-emnlp.282.pdf