Fine-tuning Handwriting Recognition systems with Temporal Dropout

01/31/2021
by Edgard Chammas, et al.

This paper introduces a novel method for fine-tuning handwriting recognition systems based on Recurrent Neural Networks (RNNs). Long Short-Term Memory (LSTM) networks model long sequences well, but they tend to overfit over the course of training. To improve the system's ability to model sequences, we propose dropping information at random positions in the sequence. We call our approach Temporal Dropout (TD). We apply TD both at the image level and to internal network representations. We show that TD improves results on two different datasets, and our method outperforms the previous state of the art on the Rodrigo dataset.
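The abstract does not include an implementation, so the following is only a minimal sketch of what dropping information at random time positions could look like in PyTorch. The function name, the (batch, time, features) tensor layout, and the choice to zero (rather than remove) dropped frames are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def temporal_dropout(x: torch.Tensor, drop_prob: float = 0.1,
                     training: bool = True) -> torch.Tensor:
    """Sketch of Temporal Dropout (TD): drop whole time steps at random.

    x has shape (batch, time, features), e.g. the column features of a
    text-line image or an internal RNN representation. Each time step is
    zeroed independently with probability `drop_prob`, removing all
    information at that position while keeping the sequence length.
    """
    if not training or drop_prob == 0.0:
        return x
    # One Bernoulli draw per time step, broadcast over the feature axis,
    # so a dropped position loses every feature at once.
    keep = (torch.rand(x.size(0), x.size(1), 1, device=x.device)
            >= drop_prob).to(x.dtype)
    return x * keep
```

Under this reading, the same function could be applied both to the input image features and to the outputs of an intermediate LSTM layer, corresponding to the abstract's image-level and internal-representation variants.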

Related research

- 11/06/2018: Bidirectional Quaternion Long-Short Term Memory Recurrent Neural Networks for Speech Recognition. Recurrent neural networks (RNN) are at the core of modern automatic spee...
- 04/18/2019: Language Modeling through Long Term Memory Network. Recurrent Neural Networks (RNN), Long Short-Term Memory Networks (LSTM),...
- 11/05/2013: Dropout improves Recurrent Neural Networks for Handwriting Recognition. Recurrent neural networks (RNNs) with Long Short-Term memory cells curre...
- 08/12/2016: When was that made? In this paper, we explore deep learning methods for estimating when obje...
- 12/17/2015: Continuous online sequence learning with an unsupervised neural network model. The ability to recognize and predict temporal sequences of sensory input...
- 12/22/2016: Handwriting recognition using Cohort of LSTM and lexicon verification with extremely large lexicon. State-of-the-art methods for handwriting recognition are based on Long S...
- 09/11/2021: College Student Retention Risk Analysis From Educational Database using Multi-Task Multi-Modal Neural Fusion. We develop a Multimodal Spatiotemporal Neural Fusion network for Multi-T...
