Context Dependent RNNLM for Automatic Transcription of Conversations

Abstract: Conversational speech, while being unstructured at an utterance level, typically has a macro topic which provides larger context spanning multiple utterances. The current language models in speech recognition systems using recurrent neural networks (RNNLM) rely mainly on the local context and exclude the larger context. In order to model the long-term dependencies of words across multiple sentences, we propose a novel architecture where the words from prior utterances are converted to an embedding. The relevance of these embeddings for the prediction of the next word in the current sentence is found using a gating network. The relevance-weighted context embedding vector is combined in the language model to improve the next-word prediction, and the entire model, including the context embedding and the relevance weighting layers, is jointly learned for a conversational language modeling task. Experiments are performed on two conversational datasets: the AMI corpus and the Switchboard corpus. In these tasks, we illustrate that the proposed approach yields significant improvements in language model perplexity over the RNNLM baseline. In addition, the use of the proposed conversational LM for ASR rescoring results in an absolute WER reduction of 1.2% on the Switchboard dataset and 1.0% on the AMI dataset over the RNNLM-based ASR baseline.
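The abstract describes the architecture only at a high level. The following is a minimal PyTorch sketch of the idea, not the authors' implementation: all names and design choices here (the class `ContextDependentRNNLM`, mean-pooling of prior-utterance word embeddings into the context vector, a sigmoid gating network over the concatenated LSTM state and context embedding, the dimensions `emb_dim` and `hidden_dim`) are illustrative assumptions; the paper may construct the context embedding and the gate differently.

```python
import torch
import torch.nn as nn

class ContextDependentRNNLM(nn.Module):
    """Sketch: an LSTM language model augmented with a gated
    embedding of the words from prior utterances."""

    def __init__(self, vocab_size, emb_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Gating network: scores the relevance of the context
        # embedding given the current LSTM state.
        self.gate = nn.Sequential(
            nn.Linear(hidden_dim + emb_dim, emb_dim),
            nn.Sigmoid(),
        )
        self.proj = nn.Linear(hidden_dim + emb_dim, vocab_size)

    def forward(self, tokens, context_tokens):
        # tokens:         (B, T) word ids of the current sentence
        # context_tokens: (B, C) word ids from prior utterances
        x = self.embed(tokens)                            # (B, T, E)
        h, _ = self.lstm(x)                               # (B, T, H)
        # Context embedding: here simply the mean of the word
        # embeddings of the prior utterances (an assumption).
        ctx = self.embed(context_tokens).mean(dim=1)      # (B, E)
        ctx = ctx.unsqueeze(1).expand(-1, h.size(1), -1)  # (B, T, E)
        # Relevance-weight the context and fuse it with the
        # local state before predicting the next word.
        g = self.gate(torch.cat([h, ctx], dim=-1))        # (B, T, E)
        fused = torch.cat([h, g * ctx], dim=-1)           # (B, T, H+E)
        return self.proj(fused)                           # (B, T, V) logits
```

Training this end-to-end with a cross-entropy loss over the logits matches the abstract's statement that the context embedding and relevance weighting layers are jointly learned with the language model:

```python
model = ContextDependentRNNLM(vocab_size=10_000)
tokens = torch.randint(0, 10_000, (2, 12))   # current sentence
context = torch.randint(0, 10_000, (2, 40))  # prior utterances
logits = model(tokens, context)              # (2, 12, 10_000)
```

For the WER results, one common rescoring setup (again an assumption, not the paper's exact recipe) is n-best rescoring, where each ASR hypothesis's first-pass score is interpolated with the conversational LM score and the best-scoring hypothesis is selected:

```python
def rescore(nbest, lm_score, lm_weight=0.5):
    # nbest: list of (token_ids, first_pass_score) pairs;
    # lm_score: callable returning the conversational LM log-probability.
    return max(nbest, key=lambda hyp: hyp[1] + lm_weight * lm_score(hyp[0]))
```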
