Open Access
Fully automatic image colorization based on semantic segmentation technology
Author(s) -
Min Xu,
Youdong Ding
Publication year - 2021
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0259953
Subject(s) - artificial intelligence , computer science , chrominance , pattern recognition (psychology) , segmentation , color image , image segmentation , computer vision , merge (version control) , feature extraction , encoder , feature (linguistics) , image (mathematics) , image processing , luminance , linguistics , philosophy , information retrieval , operating system
To address the problems of deep-learning-based image colorization, such as color bleeding and insufficient color, this paper recasts image colorization as an optimization of image semantic segmentation and proposes a fully automatic colorization model built on semantic segmentation technology. First, an encoder serves as the local feature extraction network and VGG-16 as the global feature extraction network; the two branches do not interfere with each other but share low-level features. Then, a first fusion module merges the local and global features, and the fused result is fed into a semantic segmentation network and a color prediction network, respectively. Finally, the color prediction network obtains the image's semantic segmentation information through a second fusion module and predicts the chrominance of the image on that basis. Several sets of experiments show that the model's performance improves steadily as more training data are supplied. Even in complex scenes, the model predicts reasonable colors and colorizes correctly, and the output looks realistic and natural.
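The abstract describes a two-branch architecture: a shared low-level extractor feeding a local encoder and a VGG-16 global branch, a first fusion module merging the two, a segmentation head, and a second fusion module that passes segmentation cues to the chrominance predictor. The following PyTorch sketch illustrates that data flow under stated assumptions; the layer widths, the number of segmentation classes, and the exact fusion operations are illustrative placeholders, not the configuration published by the authors.

import torch
import torch.nn as nn
import torchvision.models as models


class ColorizationModel(nn.Module):
    """Sketch of the two-branch colorization idea: a shared low-level encoder
    feeds a local branch and a VGG-16 global branch; their fusion drives both
    a semantic segmentation head and a chrominance (ab) prediction head."""

    def __init__(self, num_classes=21):
        super().__init__()
        # Shared low-level feature extractor over the grayscale L channel.
        self.low_level = nn.Sequential(
            nn.Conv2d(1, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Local feature branch (the "encoder" of the abstract).
        self.local_branch = nn.Sequential(
            nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, 512, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Global feature branch: upper VGG-16 conv layers reused from the
        # shared 128-channel low-level features (vgg16.features[10:] expects
        # 128 input channels).
        self.global_branch = models.vgg16().features[10:]
        self.global_fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(512, 512), nn.ReLU(inplace=True),
        )
        # First fusion module: merge the local map with the broadcast global vector.
        self.fusion1 = nn.Conv2d(512 + 512, 512, 1)
        # Semantic segmentation head.
        self.seg_head = nn.Sequential(
            nn.Conv2d(512, 256, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, num_classes, 1),
        )
        # Second fusion module: hand the segmentation cues to the color branch.
        self.fusion2 = nn.Conv2d(512 + num_classes, 256, 1)
        # Chrominance head: predicts the two ab channels (Tanh keeps them in [-1, 1]).
        self.color_head = nn.Sequential(
            nn.Conv2d(256, 128, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(128, 2, 1), nn.Tanh(),
        )

    def forward(self, l_channel):
        low = self.low_level(l_channel)              # shared low-level features
        local = self.local_branch(low)               # local feature map
        g = self.global_fc(self.global_branch(low))  # global feature vector
        g = g[:, :, None, None].expand(-1, -1, local.size(2), local.size(3))
        fused = torch.relu(self.fusion1(torch.cat([local, g], dim=1)))
        seg = self.seg_head(fused)                   # segmentation logits
        fused2 = torch.relu(self.fusion2(torch.cat([fused, seg], dim=1)))
        ab = self.color_head(fused2)                 # predicted chrominance
        return ab, seg                               # upsampling to full resolution omitted


if __name__ == "__main__":
    model = ColorizationModel(num_classes=21)
    l = torch.randn(1, 1, 224, 224)  # grayscale (L) input
    ab, seg = model(l)               # ab: (1, 2, 56, 56), seg: (1, 21, 56, 56)

In this sketch the chrominance output stays at reduced resolution and would be upsampled and recombined with the input luminance to form the final color image; the paper's actual upsampling and training losses are not reproduced here.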