@inproceedings{park-etal-2025-avoidance,
    title = "Avoidance Decoding for Diverse Multi-Branch Story Generation",
    author = "Park, Kyeongman  and
      Yang, Nakyeong  and
      Jung, Kyomin",
    editor = "Christodoulopoulos, Christos  and
      Chakraborty, Tanmoy  and
      Rose, Carolyn  and
      Peng, Violet",
    booktitle = "Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.381/",
    pages = "7500--7516",
    ISBN = "979-8-89176-332-6",
    abstract = "Large Language Models (LLMs) often generate repetitive and monotonous outputs, especially in tasks like story generation, due to limited creative diversity when given the same input prompt. To address this challenge, we propose a novel decoding strategy, ***Avoidance Decoding***, that modifies token logits by penalizing similarity to previously generated outputs, thereby encouraging more diverse multi-branch stories. This penalty adaptively balances two similarity measures: (1) Concept-level Similarity Penalty, which is prioritized in early stages to diversify initial story concepts, and (2) Narrative-level Similarity Penalty, which is increasingly emphasized later to ensure natural yet diverse plot development. Notably, our method achieves up to **2.6** times higher output diversity and reduces repetition by an average of 30{\%} compared to strong baselines, while effectively mitigating text degeneration. Furthermore, we reveal that our method activates a broader range of neurons, demonstrating that it leverages the model{'}s intrinsic creative capacity."
}