Attention-based Fully Gated CNN-BGRU for Russian Handwritten Text

08/12/2020
by Abdelrahman Abdallah, et al.

This research approaches the task of handwritten text recognition with attention-based encoder-decoder networks trained on the Kazakh and Russian languages. We developed a novel deep neural network model based on a fully gated CNN, supported by multiple bidirectional GRU layers and an attention mechanism, to extract sophisticated features. The model achieves a 0.045 Character Error Rate (CER), 0.192 Word Error Rate (WER), and 0.253 Sequence Error Rate (SER) on the first test dataset, and 0.064 CER, 0.24 WER, and 0.361 SER on the second test dataset. We also propose fully gated layers that multiply the output feature of a tanh activation with the input feature, which yields better results. We evaluated our model on the Handwritten Kazakh Russian Database (HKR). Our research is the first work on the HKR dataset and demonstrates state-of-the-art results compared to most existing models.
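
The "fully gated" layer described above combines a tanh feature path with an element-wise gate derived from the input feature. The sketch below is only a rough illustration of that idea, not the authors' exact layer: the use of PyTorch, the sigmoid gate, the class name, and the kernel sizes are all assumptions made for clarity.

```python
import torch
import torch.nn as nn

class GatedConvBlock(nn.Module):
    """Illustrative gated convolution (a sketch, not the paper's exact layer):
    a tanh-activated feature map is multiplied element-wise with a gate
    computed from the same input feature."""

    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        # Two parallel convolutions over the same input: one feature path, one gate path.
        self.feature_conv = nn.Conv2d(in_channels, out_channels, kernel_size, padding=padding)
        self.gate_conv = nn.Conv2d(in_channels, out_channels, kernel_size, padding=padding)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = torch.tanh(self.feature_conv(x))   # tanh output feature
        gate = torch.sigmoid(self.gate_conv(x))       # gate derived from the input feature
        return features * gate                        # element-wise gating


# Example usage on a single grayscale text-line image (batch, channels, height, width).
x = torch.randn(1, 1, 64, 256)
block = GatedConvBlock(in_channels=1, out_channels=32)
print(block(x).shape)  # torch.Size([1, 32, 64, 256])
```

In a full pipeline of this kind, the gated CNN feature maps would typically be collapsed along the height axis and fed to the bidirectional GRU encoder, with the attention decoder producing the output character sequence.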
