Open Access
Construction of Scene Tibetan Dataset Based on GAN
Author(s) -
Guowei Zhang,
Weilan Wang,
Penghai Zhao,
Jincheng Li
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1871/1/012130
Subject(s) - inpainting , robustness , computer science , artificial intelligence , generalization , image processing , pattern recognition , computer vision , style (visual arts) , natural language processing
For research on Tibetan scene text detection and recognition, collecting and manually annotating natural scene data is time-consuming and laborious, so synthetic data is of great value for advancing this work. This paper studies replacing text in other languages in a scene image with Tibetan while preserving the style of the original text. We decompose the problem into three sub-networks: a text style transfer network, a background inpainting network, and a fusion network. First, the text style transfer network renders the target Tibetan text in the style of the original text to generate the foreground image. Then the background inpainting network erases the original text from the style image and fills the text region using the surrounding context to generate the background image. Finally, the fusion network combines the generated foreground and background images to produce the target image. We ran experiments on English-to-Tibetan and English-to-English conversion to verify the generalization ability and robustness of the network. Experimental results show that the image quality metrics (SSIM, PSNR) on several datasets (SVT, ICDAR 2013) are improved to some extent.
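The three-stage decomposition described in the abstract can be sketched as a simple composition of functions. The abstract does not specify the network architectures, so each sub-network below is a hypothetical placeholder that only mimics the data flow (foreground generation, background inpainting, fusion); PSNR, one of the two reported metrics, is implemented from its standard definition:

```python
import numpy as np

def text_style_transfer(style_img, target_text_mask):
    # Placeholder for the text style transfer network: paint the
    # target (Tibetan) text mask with the style image's mean intensity.
    # A real network would render the glyphs in the source text's style.
    fg = np.zeros_like(style_img)
    fg[target_text_mask] = style_img.mean()
    return fg

def background_inpainting(style_img, text_mask):
    # Placeholder for the background inpainting network: erase the
    # original text region and fill it from the surrounding pixels
    # (here, simply their median).
    bg = style_img.copy()
    bg[text_mask] = np.median(style_img[~text_mask])
    return bg

def fusion(fg, bg, target_text_mask):
    # Placeholder for the fusion network: composite the generated
    # foreground text onto the inpainted background.
    out = bg.copy()
    out[target_text_mask] = fg[target_text_mask]
    return out

def psnr(a, b, max_val=255.0):
    # Peak signal-to-noise ratio between two images (higher is better).
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)
```

This is only a structural sketch: the three placeholder functions stand in for trained generators, and in the paper each stage is a learned (GAN-based) network rather than a fixed heuristic.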
