BERT-Based Neural Collaborative Filtering and Fixed-Length Contiguous Tokens Explanation

Reinald Adrian Pugoy, Hung-Yu Kao


Abstract
We propose a novel, accurate, and explainable recommender model (BENEFICT) that addresses two drawbacks faced by most review-based recommender systems. The first is their reliance on traditional word embeddings, which can limit prediction performance because they cannot capture the dynamic nature of word semantics. The second is their black-box nature, which obscures the rationale behind every prediction. Our model uniquely integrates three key elements: BERT, a multilayer perceptron, and the maximum subarray problem, used respectively to derive contextualized review features, model user-item interactions, and generate explanations. Our experiments show that BENEFICT consistently outperforms other state-of-the-art models, with an average improvement of nearly 7%. Based on the assessment of human judges, the explanations produced by BENEFICT capture the essence of a customer's preference and help future customers make purchasing decisions. To the best of our knowledge, our model is one of the first recommender models to utilize BERT for neural collaborative filtering.
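
To make the explanation step named in the title and abstract concrete, below is a minimal sketch of how a fixed-length contiguous token span could be selected as an explanation. It assumes per-token relevance scores are already available (in BENEFICT these would come from the BERT-derived review representation, a detail the abstract does not spell out); the function name select_explanation and the toy tokens and scores are purely illustrative, not taken from the paper.

# Hypothetical sketch: pick the contiguous window of fixed length whose
# relevance scores sum to the maximum, i.e., a fixed-length variant of the
# maximum subarray problem solved with a sliding window.

from typing import List, Tuple

def select_explanation(tokens: List[str], scores: List[float], length: int) -> Tuple[int, List[str]]:
    """Return the start index and tokens of the best fixed-length window."""
    assert len(tokens) == len(scores) and 0 < length <= len(tokens)
    window_sum = sum(scores[:length])          # score of the first window
    best_sum, best_start = window_sum, 0
    for start in range(1, len(tokens) - length + 1):
        # Slide the window one token to the right in O(1) per step.
        window_sum += scores[start + length - 1] - scores[start - 1]
        if window_sum > best_sum:
            best_sum, best_start = window_sum, start
    return best_start, tokens[best_start:best_start + length]

# Toy usage with made-up token scores (not from the paper):
tokens = "the battery life is excellent but the screen scratches easily".split()
scores = [0.1, 0.8, 0.7, 0.2, 0.9, 0.1, 0.1, 0.6, 0.5, 0.4]
print(select_explanation(tokens, scores, length=4))
# -> (1, ['battery', 'life', 'is', 'excellent'])

Under these assumptions, the selected span ("battery life is excellent") would serve as the fixed-length contiguous-tokens explanation accompanying the predicted rating.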
Anthology ID: 2020.aacl-main.18
Volume: Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month: December
Year: 2020
Address: Suzhou, China
Venue: AACL
Publisher: Association for Computational Linguistics
Pages: 143–153
URL: https://aclanthology.org/2020.aacl-main.18
Cite (ACL): Reinald Adrian Pugoy and Hung-Yu Kao. 2020. BERT-Based Neural Collaborative Filtering and Fixed-Length Contiguous Tokens Explanation. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 143–153, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): BERT-Based Neural Collaborative Filtering and Fixed-Length Contiguous Tokens Explanation (Pugoy & Kao, AACL 2020)
PDF: https://preview.aclanthology.org/ingestion-script-update/2020.aacl-main.18.pdf