Avoidance Decoding for Diverse Multi-Branch Story Generation

Kyeongman Park, Nakyeong Yang, Kyomin Jung


Abstract
Large Language Models (LLMs) often generate repetitive and monotonous outputs, especially in tasks like story generation, due to limited creative diversity when given the same input prompt. To address this challenge, we propose a novel decoding strategy, ***Avoidance Decoding***, that modifies token logits by penalizing similarity to previously generated outputs, thereby encouraging more diverse multi-branch stories. This penalty adaptively balances two similarity measures: (1) Concept-level Similarity Penalty, which is prioritized in early stages to diversify initial story concepts, and (2) Narrative-level Similarity Penalty, which is increasingly emphasized later to ensure natural yet diverse plot development. Notably, our method achieves up to **2.6** times higher output diversity and reduces repetition by an average of 30% compared to strong baselines, while effectively mitigating text degeneration. Furthermore, we reveal that our method activates a broader range of neurons, demonstrating that it leverages the model’s intrinsic creative capacity.
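The abstract only sketches the mechanism, so the snippet below is a minimal, hypothetical illustration of the core idea: subtracting a blend of two per-token similarity penalties from the logits, with the blend weight shifting from the concept-level penalty early in generation to the narrative-level penalty later. The linear schedule, the `strength` scale, and the precomputed penalty tensors are assumptions made for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def avoidance_decoding_step(logits, step, total_steps,
                            concept_penalty, narrative_penalty,
                            strength=1.0):
    """Apply a blended avoidance penalty to one step's token logits.

    Assumptions (not specified in the abstract): `concept_penalty` and
    `narrative_penalty` are precomputed [vocab_size] tensors scoring each
    candidate token's similarity to the concepts / narratives of previously
    generated story branches, and the adaptive balance between them is a
    simple linear schedule over the generation length.
    """
    # Weight shifts from the concept penalty (early steps, diversifying
    # initial story concepts) toward the narrative penalty (later steps,
    # encouraging diverse plot development).
    w = step / max(total_steps - 1, 1)
    penalty = (1.0 - w) * concept_penalty + w * narrative_penalty

    # Subtract the blended penalty, then renormalize for sampling.
    penalized = logits - strength * penalty
    return F.softmax(penalized, dim=-1)

# Example usage with dummy tensors (hypothetical vocabulary of 50k tokens):
vocab = 50_000
logits = torch.randn(vocab)
concept_pen = torch.rand(vocab)    # stand-in similarity scores
narrative_pen = torch.rand(vocab)
probs = avoidance_decoding_step(logits, step=0, total_steps=200,
                                concept_penalty=concept_pen,
                                narrative_penalty=narrative_pen)
next_token = torch.multinomial(probs, num_samples=1)
```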
Anthology ID: 2025.emnlp-main.381
Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2025
Address: Suzhou, China
Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 7500–7516
URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.381/
Cite (ACL): Kyeongman Park, Nakyeong Yang, and Kyomin Jung. 2025. Avoidance Decoding for Diverse Multi-Branch Story Generation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 7500–7516, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): Avoidance Decoding for Diverse Multi-Branch Story Generation (Park et al., EMNLP 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.381.pdf
Checklist: 2025.emnlp-main.381.checklist.pdf