
A Study on Effects of Implicit and Explicit Language Model Information for DBLSTM-CTC Based Handwriting Recognition


Document pages: 5

Abstract: A Deep Bidirectional Long Short-Term Memory (DBLSTM) network with a Connectionist Temporal Classification (CTC) output layer has been established as one of the state-of-the-art solutions for handwriting recognition. It is well known that a DBLSTM trained with a CTC objective function learns both local character-image dependency for character modeling and long-range contextual dependency for implicit language modeling. In this paper, we study the effects of implicit and explicit language model information for DBLSTM-CTC based handwriting recognition by comparing the performance obtained with and without an explicit language model in decoding. It is observed that even when one million lines of training sentences are used to train the DBLSTM, using an explicit language model is still helpful. To deal with such a large-scale training problem, a GPU-based training tool has been developed for CTC training of DBLSTM by using a mini-batch based epochwise Back Propagation Through Time (BPTT) algorithm.
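For readers unfamiliar with the architecture named in the abstract, the sketch below shows a minimal bidirectional LSTM stack with a CTC output layer and a single mini-batch training step in PyTorch. This is an illustrative assumption, not the paper's GPU training tool: the dimensions (feat_dim, hidden, num_chars), the optimizer, and the synthetic batch are all placeholders.

```python
# Minimal sketch (not the authors' implementation) of a DBLSTM-CTC model,
# assuming each handwriting line has been converted to a sequence of
# frame-level feature vectors.
import torch
import torch.nn as nn

class DBLSTMCTC(nn.Module):
    def __init__(self, feat_dim=100, hidden=256, layers=3, num_chars=80):
        super().__init__()
        # Deep bidirectional LSTM stack: forward and backward passes over the
        # frame sequence capture long-range contextual dependency.
        self.blstm = nn.LSTM(feat_dim, hidden, num_layers=layers,
                             bidirectional=True, batch_first=True)
        # Per-frame character posteriors; index 0 is the CTC "blank" symbol.
        self.proj = nn.Linear(2 * hidden, num_chars + 1)

    def forward(self, x):
        h, _ = self.blstm(x)                  # (batch, frames, 2*hidden)
        return self.proj(h).log_softmax(-1)   # log-probabilities for CTC

model = DBLSTMCTC()
ctc_loss = nn.CTCLoss(blank=0, zero_infinity=True)
optim = torch.optim.Adam(model.parameters(), lr=1e-4)

# One mini-batch step: gradients are propagated through the whole line
# (epochwise BPTT, i.e. no truncation of the sequence).
feats = torch.randn(8, 150, 100)                      # 8 lines, 150 frames each
targets = torch.randint(1, 81, (8, 20))               # 8 label sequences
input_lens = torch.full((8,), 150, dtype=torch.long)
target_lens = torch.full((8,), 20, dtype=torch.long)

log_probs = model(feats).transpose(0, 1)              # CTCLoss expects (T, N, C)
loss = ctc_loss(log_probs, targets, input_lens, target_lens)
optim.zero_grad()
loss.backward()
optim.step()
```

In decoding, the explicit language model studied in the paper would be combined with these per-frame posteriors (e.g. in a beam search), whereas greedy CTC decoding relies only on the implicit language model learned by the network.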
