BLooP: Zero-Shot Abstractive Summarization Using Large Language Models with Bigram Lookahead Promotion

Varun Iyer, Cornelia Caragea


Abstract
Abstractive summarization requires models to generate summaries that convey information in the source document. While large language models can generate summaries without fine-tuning, they often miss key details and include extraneous information. We propose BLooP (Bigram Lookahead Promotion), a simple training-free decoding intervention that encourages large language models (LLMs) to generate tokens that form bigrams from the source document. BLooP operates through a hash table lookup at each decoding step, requiring no training, fine-tuning, or model modification. We demonstrate improvements in ROUGE and BARTScore for [Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct), [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407), and [Gemma-2-9B-IT](https://huggingface.co/google/gemma-2-9b-it) on CNN/DM, CCSum, Multi-News, and SciTLDR. Human evaluation shows that BLooP significantly improves faithfulness without reducing readability. We make the code available [here](https://github.com/varuniyer/BLooP).
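The abstract describes promoting tokens that complete bigrams from the source document via a hash-table lookup at each decoding step. A minimal sketch of that idea follows; the function names, the additive logit bonus, and its value are illustrative assumptions, not the paper's actual scoring rule.

```python
from collections import defaultdict

def build_bigram_table(source_ids):
    """Map each source token id to the set of ids that follow it
    somewhere in the source (the source document's bigrams)."""
    table = defaultdict(set)
    for first, second in zip(source_ids, source_ids[1:]):
        table[first].add(second)
    return table

def promote_bigrams(logits, prev_token, table, bonus=2.0):
    """At one decoding step, add a bonus (hypothetical value) to the
    logits of tokens that would complete a source bigram with the
    previously generated token. O(1) table lookup per step."""
    boosted = list(logits)
    for tok in table.get(prev_token, ()):
        boosted[tok] += bonus
    return boosted
```

For example, if the source token ids are `[1, 2, 3, 2, 4]` and the last generated token is `2`, the lookup returns `{3, 4}` and only those two logits are boosted; all other tokens are left untouched, so the intervention is training-free and model-agnostic.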
Anthology ID:
2026.lrec-main.482
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
Publisher:
ELRA Language Resource Association
Pages:
6080–6102
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.482/
Cite (ACL):
Varun Iyer and Cornelia Caragea. 2026. BLooP: Zero-Shot Abstractive Summarization Using Large Language Models with Bigram Lookahead Promotion. In Proceedings of the Fifteenth Language Resources and Evaluation Conference, pages 6080–6102, Palma de Mallorca, Spain. ELRA Language Resource Association.
Cite (Informal):
BLooP: Zero-Shot Abstractive Summarization Using Large Language Models with Bigram Lookahead Promotion (Iyer & Caragea, LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.482.pdf
Optional supplementary material:
 2026.lrec-main.482.OptionalSupplementaryMaterial.zip