Open Access
Speedy and accurate image super‐resolution via deeply recursive CNN with skip connection and network in network
Author(s) - Guo Dan, Niu Yanxiong, Xie Pengyan
Publication year - 2019
Publication title - IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2018.5907
Subject(s) - computer science, convolutional neural network, margin (machine learning), image (mathematics), artificial intelligence, feature (linguistics), residual, pattern recognition (psychology), state (computer science), feature extraction, algorithm, machine learning, philosophy, linguistics
Single image super‐resolution (SISR) methods based on deep convolutional neural networks (CNNs) have recently achieved significant improvements in accuracy, advancing the state of the art. However, these deeper models are computationally expensive and contain a large number of parameters; accordingly, they demand more memory and are unsuitable for on‐chip devices. In this study, a novel SISR method using a deeply recursive CNN with skip connections and a network‐in‐network structure is proposed. The deeply recursive CNN with skip connections extracts image features at both local and global levels, while parallelised 1 × 1 convolutions, usually called a network‐in‐network structure, perform the image reconstruction. Specifically, recursive learning is used to control the number of model parameters and residual learning is used to ease training. The proposed method performs favourably against state‐of‐the‐art methods in both computational speed and accuracy: it outperforms previous methods by a large margin while demanding far fewer parameters. The resulting model requires less memory and is well suited to on‐chip devices.
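To make the described pipeline more concrete, below is a minimal PyTorch sketch of the kind of architecture the abstract outlines: a single shared 3 × 3 convolution applied recursively (weight sharing keeps the parameter count fixed), local skip connections back to the embedded features, parallel 1 × 1 convolutions fusing the recursive outputs as a network‐in‐network stage, and a global residual over a bicubic‐upsampled input. The class name RecursiveSRNet, the channel width, the recursion depth, and the exact skip pattern are illustrative assumptions, not the authors' published configuration.

# A minimal sketch, assuming grayscale (Y-channel) input and the hyperparameters below.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RecursiveSRNet(nn.Module):
    def __init__(self, channels=64, recursions=9, scale=2):
        super().__init__()
        self.embed = nn.Conv2d(1, channels, 3, padding=1)              # initial feature extraction
        self.recursive = nn.Conv2d(channels, channels, 3, padding=1)   # one conv reused at every recursion (weight sharing)
        self.relu = nn.ReLU(inplace=True)
        self.recursions = recursions
        # "Network in network": parallel 1x1 convolutions, one per recursion output
        self.fuse = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=1) for _ in range(recursions)
        )
        self.reconstruct = nn.Conv2d(channels, 1, 3, padding=1)
        self.scale = scale

    def forward(self, lr):
        # Work on a bicubic-upsampled input and predict only the residual (residual learning).
        x = F.interpolate(lr, scale_factor=self.scale, mode="bicubic", align_corners=False)
        feat = self.relu(self.embed(x))
        out, states = feat, []
        for _ in range(self.recursions):
            out = self.relu(self.recursive(out)) + feat   # local skip connection to the embedded features
            states.append(out)
        # Fuse every recursion's output through its own 1x1 conv, then average.
        fused = sum(f(s) for f, s in zip(self.fuse, states)) / self.recursions
        return self.reconstruct(fused) + x                # global skip: add the interpolated input back


if __name__ == "__main__":
    net = RecursiveSRNet()
    y = net(torch.randn(1, 1, 32, 32))    # 32x32 low-res patch -> 64x64 output
    print(y.shape)                        # torch.Size([1, 1, 64, 64])

Because the recursive convolution is a single module reused at every step, increasing the recursion depth deepens the effective receptive field without adding parameters, which is the property the abstract credits for the small memory footprint.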
