Open Access
GLStyleNet: exquisite style transfer combining global and local pyramid features
Author(s) -
Wang Zhizhong,
Zhao Lei,
Lin Sihuan,
Mo Qihang,
Zhang Huiming,
Xing Wei,
Lu Dongming
Publication year - 2020
Publication title -
IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2019.0844
Subject(s) - computer science, style (visual arts), artificial intelligence, pyramid (geometry), quality (philosophy), feature (linguistics), process (computing), expression (computer science), transfer (computing), artificial neural network, pattern recognition (psychology), natural language processing, mathematics, linguistics, art, philosophy, geometry, literature, epistemology, parallel computing, programming language, operating system
Recent studies using deep neural networks have shown remarkable success in style transfer, especially for artistic and photo‐realistic images. However, these methods fall short on more sophisticated problems: approaches built on global statistics fail to capture the small, intricate textures of artworks and to maintain correct texture scales, while those based on local patches lose the global effect. To address these issues, this study presents a unified model, the global and local style network (GLStyleNet), that achieves exquisite style transfer of higher quality. Specifically, a simple yet effective perceptual loss is proposed that jointly accounts for global semantic‐level structure, local patch‐level style, and global channel‐level effect. This helps transfer not only large‐scale, obvious style cues but also subtle, exquisite ones, and dramatically improves the quality of style transfer. In addition, the authors introduce a novel deep pyramid feature fusion module that provides a more flexible style expression and a more efficient transfer process, retaining both high‐frequency pixel information and low‐frequency structural information. They demonstrate the effectiveness and superiority of their approach on numerous style transfer tasks, especially Chinese ancient painting style transfer. Experimental results indicate that their unified approach improves image style transfer quality over previous state‐of‐the‐art methods.
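The abstract describes a perceptual loss with three terms: a global semantic-level (content) term, a local patch-level style term, and a global channel-level style term. Below is a minimal NumPy sketch of how such a combined loss could be assembled; it is an illustration only, not the authors' implementation. The function names (`content_loss`, `patch_loss`, `gram_loss`, `glstylenet_loss`), the weights `alpha`/`beta`/`gamma`, and the use of raw arrays in place of VGG feature maps are all assumptions made for the example. The channel-level term uses the standard Gram-matrix formulation, and the patch-level term uses MRF-style nearest-neighbour patch matching by cosine similarity, which are the usual choices for these two loss families.

```python
import numpy as np

def content_loss(f_out, f_content):
    """Global semantic-level term: mean squared distance between
    (C, H, W) feature maps of the output and content images."""
    return float(np.mean((f_out - f_content) ** 2))

def gram_loss(f_out, f_style):
    """Global channel-level term: squared Frobenius distance between
    Gram matrices of features flattened to shape (C, H*W)."""
    a = f_out.reshape(f_out.shape[0], -1)
    b = f_style.reshape(f_style.shape[0], -1)
    g_out = a @ a.T / a.shape[1]
    g_sty = b @ b.T / b.shape[1]
    return float(np.sum((g_out - g_sty) ** 2))

def _extract_patches(f, k):
    """All k-by-k patches of a (C, H, W) feature map, flattened to rows."""
    C, H, W = f.shape
    rows = [f[:, i:i + k, j:j + k].ravel()
            for i in range(H - k + 1) for j in range(W - k + 1)]
    return np.stack(rows)  # (num_patches, C*k*k)

def patch_loss(f_out, f_style, k=3):
    """Local patch-level term: match each output patch to its
    nearest style patch (cosine similarity), then penalise the
    squared distance to that match."""
    p_out = _extract_patches(f_out, k)
    p_sty = _extract_patches(f_style, k)
    n_out = p_out / (np.linalg.norm(p_out, axis=1, keepdims=True) + 1e-8)
    n_sty = p_sty / (np.linalg.norm(p_sty, axis=1, keepdims=True) + 1e-8)
    nearest = np.argmax(n_out @ n_sty.T, axis=1)  # index of best match
    return float(np.mean((p_out - p_sty[nearest]) ** 2))

def glstylenet_loss(f_out, f_content, f_style,
                    alpha=1.0, beta=1.0, gamma=1.0):
    """Weighted sum of the three terms; the weights are illustrative."""
    return (alpha * content_loss(f_out, f_content)
            + beta * patch_loss(f_out, f_style)
            + gamma * gram_loss(f_out, f_style))
```

In a real pipeline these terms would be computed on deep features (e.g. VGG activations) at several pyramid levels rather than on raw arrays, and the output image would be optimised to minimise the total loss.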
