Occluded Multi-Lingual Offline Handwriting Inpainting Based on Multi-Head Attention and Stacked-LSTM Decoder

EasyChair Preprint 9696, 16 pages. Date: February 14, 2023

Abstract

The encoder-decoder model with attention has become a common framework for online handwriting recovery: a Convolutional Neural Network (CNN) serves as the encoder and a Long Short-Term Memory (LSTM) network with an attention mechanism as the decoder. Inspired by the recent success of transformers across many tasks, we introduce in this paper a novel recovery-inpainting framework, named Temporal Order with Multi-Head Attention Network and stacked-LSTM decoder (TO-MultiNet), which denoises corrupted offline handwriting and recovers its online counterpart signal characterized by dynamic features. First, the TO-MultiNet framework is trained to generate the temporal order and the pen velocity from offline handwriting. The resulting model is then used to inpaint occluded handwriting images. This work is validated with the Beta-GRU recognition system applied to Arabic, Latin, and Indian On/Off dual handwriting datasets. Experimental results demonstrate the effectiveness of multi-head attention with a stacked-LSTM decoder, which increases the quality of the restored image and improves the recognition rate obtained with the novel Beta-GRU model.

Keyphrases: Beta-elliptic model, GRU, LSTM, occluded offline handwriting, multi-head attention, transformer
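The abstract does not give implementation details of TO-MultiNet, but the multi-head attention mechanism it builds on can be sketched in a few lines. The following is a minimal, illustrative NumPy implementation of scaled dot-product multi-head attention, not the authors' model: the projection weights are random stand-ins for learned parameters, and all shapes and names (`d_model`, `num_heads`, the toy feature sequence `x`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, num_heads, rng):
    # q, k, v: (seq_len, d_model) feature sequences (toy shapes for illustration)
    seq_len, d_model = q.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    # random projections stand in for the learned weight matrices W_Q, W_K, W_V, W_O
    wq, wk, wv, wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))

    def split_heads(x, w):
        # project, then reshape to (num_heads, seq_len, d_head)
        return (x @ w).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    qh, kh, vh = split_heads(q, wq), split_heads(k, wk), split_heads(v, wv)
    # per-head scaled dot-product attention: (num_heads, seq_len, seq_len)
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # concatenate heads back to (seq_len, d_model) and apply the output projection
    out = (attn @ vh).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ wo, attn

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))  # toy stand-in for an offline-handwriting feature sequence
y, attn = multi_head_attention(x, x, x, num_heads=2, rng=rng)
print(y.shape)     # (5, 8)
print(attn.shape)  # (2, 5, 5)
```

In the paper's setting, the output of such an attention block would feed a stacked-LSTM decoder that emits the temporal order and pen velocity; that decoder is omitted here for brevity.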