Generative Reviewer Agents: Scalable Simulacra of Peer Review

Nicolas Bougie, Narimawa Watanabe


Abstract
The peer review process is fundamental to scientific progress, determining which papers meet the quality standards for publication. Yet the rapid growth of scholarly output and increasing specialization of knowledge areas strain traditional scientific feedback mechanisms. In light of this, we introduce Generative Agent Reviewers (GAR), leveraging LLM-empowered agents to faithfully simulate peer reviewers. To enable generative reviewers, we design an architecture that extends a large language model with memory capabilities and equips agents with reviewer personas derived from historical data. Our experiments demonstrate that GAR performs comparably to human reviewers in providing detailed feedback and predicting paper outcomes. Beyond this performance comparison, we conduct further analyses, such as evaluating the impact of reviewer expertise and examining fairness in reviews. By offering early, expert-level feedback that is typically restricted to a limited group of researchers, GAR democratizes access to transparent and in-depth evaluation.
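To make the architecture sketched in the abstract concrete (an LLM extended with memory and conditioned on reviewer personas), the following is a minimal, hypothetical sketch. It is not the paper's implementation: the names `ReviewerAgent` and `call_llm`, the prompts, and the memory scheme are illustrative assumptions only.

```python
# Hypothetical sketch of a persona-conditioned reviewer agent with a simple
# running memory, loosely following the abstract's description. Names and
# prompts are illustrative assumptions, not the paper's actual system.
from dataclasses import dataclass, field
from typing import List


def call_llm(prompt: str) -> str:
    """Placeholder for any chat-completion backend (assumed, not specified by the paper)."""
    raise NotImplementedError("Plug in an LLM client of your choice.")


@dataclass
class ReviewerAgent:
    persona: str                                      # e.g. distilled from a reviewer's past reviews
    memory: List[str] = field(default_factory=list)   # notes accumulated while reading the paper

    def read(self, section: str) -> None:
        # Summarize each section into memory so long papers fit the context window.
        note = call_llm(
            f"You are a peer reviewer with this profile:\n{self.persona}\n\n"
            f"Summarize the key claims and potential weaknesses of this section:\n{section}"
        )
        self.memory.append(note)

    def review(self) -> str:
        # Draft a full review conditioned on the persona and the accumulated notes.
        notes = "\n".join(self.memory)
        return call_llm(
            f"You are a peer reviewer with this profile:\n{self.persona}\n\n"
            f"Using your reading notes below, write a review with strengths, "
            f"weaknesses, questions, and an overall score.\n\n{notes}"
        )
```

In use, such an agent would be instantiated with a persona string, fed the paper section by section via read(), and then asked for a full review via review(); how GAR actually constructs personas and memory is detailed in the paper itself.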
Anthology ID:
2025.emnlp-industry.8
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
98–116
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.8/
Cite (ACL):
Nicolas Bougie and Narimawa Watanabe. 2025. Generative Reviewer Agents: Scalable Simulacra of Peer Review. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 98–116, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Generative Reviewer Agents: Scalable Simulacra of Peer Review (Bougie & Watanabe, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.8.pdf