Routing Enforced Generative Model for Recipe Generation

Zhiwei Yu, Hongyu Zang, Xiaojun Wan


Abstract
One of the most challenging parts of recipe generation is handling the complex restrictions among the input ingredients. Previous research simplifies the problem by treating the inputs independently and generating recipes that contain as much information as possible. In this work, we propose a routing method that performs content selection under these internal restrictions. The routing enforced generative model (RGM) generates appropriate recipes according to the given ingredients and user preferences. Our model yields new state-of-the-art results on the recipe generation task, with significant improvements in BLEU, F1, and human evaluation.
Anthology ID:
2020.emnlp-main.311
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3797–3806
URL:
https://aclanthology.org/2020.emnlp-main.311
DOI:
10.18653/v1/2020.emnlp-main.311
Cite (ACL):
Zhiwei Yu, Hongyu Zang, and Xiaojun Wan. 2020. Routing Enforced Generative Model for Recipe Generation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3797–3806, Online. Association for Computational Linguistics.
Cite (Informal):
Routing Enforced Generative Model for Recipe Generation (Yu et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/ingest-bitext-workshop/2020.emnlp-main.311.pdf
Video:
https://slideslive.com/38939193