https://scholars.lib.ntu.edu.tw/handle/123456789/607150
Title: Towards lifelong learning of end-to-end ASR
Authors: Chang H.-J.; Lee H.-Y.; Lin-shan Lee
Keywords: Continual learning; End-to-end automatic speech recognition; Lifelong learning; Speech communication; Acoustic conditions; Application environment; Automatic speech recognition; Automatic speech recognition technology; End-to-end; Performance; Real-world; Speech recognition
Issue Date: 2021
Journal Volume: 2
Start page/Pages: 1306-1310
Source: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
Abstract: Automatic speech recognition (ASR) technologies today are primarily optimized for given datasets; thus, any changes in the application environment (e.g., acoustic conditions or topic domains) may inevitably degrade performance. We can collect new data describing the new environment and fine-tune the system, but this naturally leads to higher error rates on the earlier datasets, a phenomenon referred to as catastrophic forgetting. The concept of lifelong learning (LLL), which aims to enable a machine to sequentially learn new tasks from new datasets describing the changing real world without forgetting previously learned knowledge, has thus drawn attention. This paper reports, to our knowledge, the first effort to extensively consider and analyze various LLL approaches for end-to-end (E2E) ASR, including novel methods for saving data from past domains to mitigate the catastrophic forgetting problem. An overall relative reduction of 28.7% in word error rate (WER) was achieved compared to the fine-tuning baseline when sequentially learning on three very different benchmark corpora. This can be a first step toward the highly desired ASR technologies capable of synchronizing with the continuously changing real world. Copyright © 2021 ISCA.
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85119176339&doi=10.21437%2fInterspeech.2021-563&partnerID=40&md5=7bf3770b54d56baf0f23b0fcbff38f30 ; https://scholars.lib.ntu.edu.tw/handle/123456789/607150
ISSN: 2308-457X
DOI: 10.21437/Interspeech.2021-563
Appears in Collections: Department of Electrical Engineering (電機工程學系)
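The abstract mentions mitigating catastrophic forgetting by saving data from past domains and mixing it back in when fine-tuning on a new domain. A minimal sketch of that general idea (a reservoir-sampled replay buffer combined with new-domain batches) is shown below; all names are hypothetical illustrations of replay-based continual learning in general, not the paper's actual selection methods:

```python
import random


class ReplayBuffer:
    """Fixed-capacity store of past-domain samples (hypothetical sketch).

    Uses reservoir sampling so the buffer holds a uniform random subset
    of every sample ever added, regardless of stream length.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.seen = 0  # total samples observed so far

    def add(self, sample):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(sample)
        else:
            # Replace a stored item with probability capacity / seen.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = sample

    def sample(self, k):
        # Draw up to k stored past-domain samples without replacement.
        return random.sample(self.items, min(k, len(self.items)))


def mixed_batch(new_batch, buffer, replay_k):
    """Build a fine-tuning batch: new-domain samples plus replayed old ones."""
    return list(new_batch) + buffer.sample(replay_k)
```

Each fine-tuning step on the new domain would then train on `mixed_batch(...)`, so gradients keep reflecting earlier domains and the model forgets them less.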