@inproceedings{venkateswaran-liu-2024-looking,
    title = "Looking within the self: Investigating the Impact of Data Augmentation with Self-training on Automatic Speech Recognition for {H}upa",
    author = "Venkateswaran, Nitin  and
      Liu, Zoey",
    editor = "Moeller, Sarah  and
      Agyapong, Godfred  and
      Arppe, Antti  and
      Chaudhary, Aditi  and
      Rijhwani, Shruti  and
      Cox, Christopher  and
      Henke, Ryan  and
      Palmer, Alexis  and
      Rosenblum, Daisy  and
      Schwartz, Lane",
    booktitle = "Proceedings of the Seventh Workshop on the Use of Computational Methods in the Study of Endangered Languages",
    month = mar,
    year = "2024",
    address = "St. Julians, Malta",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2024.computel-1.9/",
    pages = "58--66",
    abstract = "We investigate the performance of state-of-the-art neural ASR systems in transcribing audio recordings for Hupa, a critically endangered language of the Hoopa Valley Tribe. We also explore the impact on ASR performance when augmenting a small dataset of gold-standard high-quality transcriptions with a) a larger dataset with transcriptions of lower quality, and b) model-generated transcriptions in a self-training approach. An evaluation of both data augmentation approaches shows that the self-training approach is competitive, producing better WER scores than models trained with no additional data and not lagging far behind models trained with additional lower quality manual transcriptions instead: the deterioration in WER score is just 4.85 points when all the additional data is used in experiments with the best performing system, Wav2Vec. These findings have encouraging implications for the use of ASR systems for transcription and language documentation efforts in the Hupa language."
}