Open Access
Towards Deep Style Transfer: A Content-Aware Perspective
Author(s) - Yilei Chen, Chiou-Ting Hsu
Publication year - 2016
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.30.8
Subject(s) - computer science , artificial intelligence , style (visual arts) , transfer (computing)
Modern research has demonstrated that many eye-catching images can be generated by style transfer via deep neural networks. There is, however, a dearth of research on content-aware style transfer. In this paper, we generalize the neural algorithm for style transfer from two perspectives: where to transfer and what to transfer. To specify where to transfer, we propose a simple yet effective strategy, named masking out, to constrain the transfer layout. To specify what to transfer, we define a new style feature based on high-order statistics to better characterize content coherency. Without resorting to additional local matching or MRF models, the proposed method embeds the desired content information, either semantic-aware or saliency-aware, into the original framework seamlessly. Experimental results show that our method is applicable to various types of style transfer and can be extended to image inpainting.
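The "masking out" idea described above can be illustrated with a small sketch: in Gram-matrix style transfer, the style of a layer is summarized by second-order feature statistics, and restricting those statistics to a binary spatial mask constrains where style is transferred. The function below is a minimal NumPy illustration of that general idea, not the paper's exact formulation; the shapes, the function name `masked_gram`, and the normalization choice are all assumptions for the example.

```python
import numpy as np

def masked_gram(features, mask):
    """Gram matrix of deep features restricted to a binary spatial mask.

    features: (C, H, W) feature maps from some network layer.
    mask:     (H, W) binary mask; 1 marks locations where style is transferred.
    Returns a (C, C) Gram matrix computed only over the masked locations.
    (Illustrative sketch; not the paper's exact loss.)
    """
    C, H, W = features.shape
    f = features.reshape(C, H * W)
    m = mask.reshape(H * W).astype(features.dtype)
    fm = f * m                       # zero out features outside the mask
    n = max(m.sum(), 1.0)            # number of active spatial locations
    return fm @ fm.T / n

# Toy example: a 2-channel 2x2 feature map, mask selecting the left column.
feat = np.arange(8, dtype=np.float64).reshape(2, 2, 2)
mask = np.array([[1, 0], [1, 0]])
G = masked_gram(feat, mask)          # (2, 2) masked style statistics
```

In a full pipeline, one such masked Gram matrix per region (e.g. per semantic or saliency segment) would replace the single global Gram matrix in the style loss, so each region matches its own style target.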
