
Regularized Forward-Backward Decoder for Attention Models



Abstract: Nowadays, attention models are among the most popular candidates for speech recognition. So far, many studies have focused mainly on the encoder structure or the attention module to enhance the performance of these models, while the decoder has mostly been ignored. In this paper, we propose a novel regularization technique that incorporates a second decoder during the training phase. This decoder is optimized on time-reversed target labels beforehand and supports the standard decoder during training by adding knowledge from future context. Since it is only added during training, we do not change the basic structure of the network or add complexity during decoding. We evaluate our approach on the smaller TEDLIUMv2 and the larger LibriSpeech dataset, achieving consistent improvements on both of them.
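The abstract only outlines the mechanism, so the following is a minimal, illustrative sketch of how such a forward-backward decoder regularization could look during training. It is not the authors' implementation: the toy decoder, the MSE coupling between forward and backward hidden states, and the weight `lam` are assumptions; the paper only states that a second decoder, pre-trained on time-reversed labels, supports the standard decoder with knowledge from future context during training.

```python
# Sketch (not the authors' code) of regularizing a forward decoder with a
# pre-trained backward decoder that was optimized on time-reversed labels.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleDecoder(nn.Module):
    """Toy label decoder: embeds the previous label and predicts the next one."""

    def __init__(self, vocab_size: int, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, labels: torch.Tensor):
        hidden_states, _ = self.rnn(self.embed(labels))
        return self.out(hidden_states), hidden_states


def regularized_loss(fwd_dec, bwd_dec, labels, lam=0.5):
    """Cross-entropy of the forward decoder plus an assumed coupling term to
    the fixed backward decoder, which was pre-trained on reversed labels."""
    logits_f, h_f = fwd_dec(labels)
    ce = F.cross_entropy(logits_f.transpose(1, 2), labels)

    with torch.no_grad():                      # backward decoder stays fixed here
        rev = torch.flip(labels, dims=[1])     # time-reversed target sequence
        _, h_b = bwd_dec(rev)
        h_b = torch.flip(h_b, dims=[1])        # re-align to forward time order

    # Assumed coupling: pull forward hidden states toward the backward ones,
    # injecting "future context" into the forward decoder during training only.
    reg = F.mse_loss(h_f, h_b)
    return ce + lam * reg


# Usage: vocab of 32 labels, batch of 4 sequences of length 10.
fwd, bwd = SimpleDecoder(32), SimpleDecoder(32)
labels = torch.randint(0, 32, (4, 10))
loss = regularized_loss(fwd, bwd, labels)
loss.backward()                                # only the forward decoder is updated
```

Because the coupling term is dropped at inference time, the decoding network and its cost stay unchanged, which matches the abstract's claim that no complexity is added during decoding.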
