On the Evaluation of Vision-and-Language Navigation Instructions
Ming Zhao, Peter Anderson, Vihan Jain, Su Wang, Alexander Ku, Jason Baldridge, Eugene Ie
Abstract
Vision-and-Language Navigation wayfinding agents can be enhanced by exploiting automatically generated navigation instructions. However, existing instruction generators have not been comprehensively evaluated, and the automatic evaluation metrics used to develop them have not been validated. Using human wayfinders, we show that these generators perform on par with or only slightly better than a template-based generator and far worse than human instructors. Furthermore, we discover that BLEU, ROUGE, METEOR and CIDEr are ineffective for evaluating grounded navigation instructions. To improve instruction evaluation, we propose an instruction-trajectory compatibility model that operates without reference instructions. Our model shows the highest correlation with human wayfinding outcomes when scoring individual instructions. For ranking instruction generation systems, if reference instructions are available we recommend using SPICE.
- Anthology ID: 2021.eacl-main.111
- Volume: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
- Month: April
- Year: 2021
- Address: Online
- Venue: EACL
- Publisher: Association for Computational Linguistics
- Pages: 1302–1316
- URL: https://aclanthology.org/2021.eacl-main.111
- DOI: 10.18653/v1/2021.eacl-main.111
- Cite (ACL): Ming Zhao, Peter Anderson, Vihan Jain, Su Wang, Alexander Ku, Jason Baldridge, and Eugene Ie. 2021. On the Evaluation of Vision-and-Language Navigation Instructions. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1302–1316, Online. Association for Computational Linguistics.
- Cite (Informal): On the Evaluation of Vision-and-Language Navigation Instructions (Zhao et al., EACL 2021)
- PDF: https://preview.aclanthology.org/remove-xml-comments/2021.eacl-main.111.pdf
- Data: COCO Captions, RxR