Analysis of Resource-efficient Predictive Models for Natural Language Processing

Raj Pranesh, Ambesh Shekhar


Abstract
In this paper, we present an analysis of resource-efficient predictive models, namely Bonsai, Binary Neighbor Compression (BNC), ProtoNN, Random Forest, Naive Bayes, and Support Vector Machine (SVM), for machine learning on resource-constrained devices. These models aim to minimize resource requirements such as RAM and storage without significantly hurting accuracy. We evaluated the models on multiple benchmark natural language processing tasks: sentiment analysis, spam message detection, emotion analysis, and fake news classification. The experimental results show that the tree-based algorithm, Bonsai, surpassed the other machine learning algorithms, achieving higher accuracy scores while using significantly less memory.
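The abstract describes a memory-versus-accuracy comparison of lightweight classifiers on text-classification tasks. As a rough illustration of that kind of workflow (not the authors' implementation; the toy spam/ham corpus, the chosen hyperparameters, and the pickle-size proxy for memory footprint are all assumptions), a minimal scikit-learn sketch comparing three of the listed baselines might look like:

```python
# Illustrative sketch only: compare classic classifiers on a toy text task,
# tracking test accuracy and serialized model size as a rough memory proxy.
import pickle

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical toy corpus standing in for a spam-detection benchmark.
texts = ["win a free prize now", "meeting at 10am tomorrow",
         "claim your reward today", "lunch with the team",
         "urgent: verify your account", "project report attached"] * 50
labels = [1, 0, 1, 0, 1, 0] * 50  # 1 = spam, 0 = ham

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.2, random_state=0)

# Bag-of-words / TF-IDF features, as is typical for such lightweight baselines.
vectorizer = TfidfVectorizer()
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=50, random_state=0),
    "Naive Bayes": MultinomialNB(),
    "Linear SVM": LinearSVC(),
}

for name, model in models.items():
    model.fit(X_train_vec, y_train)
    acc = accuracy_score(y_test, model.predict(X_test_vec))
    size_kb = len(pickle.dumps(model)) / 1024  # crude stand-in for RAM/storage cost
    print(f"{name}: accuracy={acc:.3f}, model size={size_kb:.1f} KB")
```

Bonsai and ProtoNN, the specialized resource-efficient models named in the abstract, would slot into the same loop but come from Microsoft's EdgeML library rather than scikit-learn; they are omitted here to keep the sketch self-contained.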
Anthology ID:
2020.sustainlp-1.18
Volume:
Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing
Month:
November
Year:
2020
Address:
Online
Venue:
sustainlp
Publisher:
Association for Computational Linguistics
Pages:
136–140
URL:
https://aclanthology.org/2020.sustainlp-1.18
DOI:
10.18653/v1/2020.sustainlp-1.18
Cite (ACL):
Raj Pranesh and Ambesh Shekhar. 2020. Analysis of Resource-efficient Predictive Models for Natural Language Processing. In Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing, pages 136–140, Online. Association for Computational Linguistics.
Cite (Informal):
Analysis of Resource-efficient Predictive Models for Natural Language Processing (Pranesh & Shekhar, sustainlp 2020)
PDF:
https://preview.aclanthology.org/paclic-22-ingestion/2020.sustainlp-1.18.pdf
Video:
https://slideslive.com/38939440