Open Access
Residual Attention Fusion Network for Single Image Super-Resolution
Author(s) -
Hao Zhang,
Chuwen Lan,
Zehua Gao
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2031/1/012013
Subject(s) - residual, computer science, convolutional neural network, artificial intelligence, pattern recognition, image fusion, deep learning, image (mathematics), algorithm, optics
Recently, very deep convolutional neural networks (CNNs) have demonstrated impressive performance in the field of single image super-resolution (SISR). However, most CNN-based methods focus solely on designing deeper and wider network structures and do not exploit the hierarchical and global features of the input image. Therefore, we proposed a residual attention fusion network (RAFN), built on an improved residual fusion (RF) framework, to effectively extract hierarchical features for single image super-resolution. The proposed framework comprises two residual fusion structures, each composed of several residual and fusion modules, and realizes a continuous memory mechanism by adding long and short skip connections, so the network focuses on learning more effective features. Furthermore, to maximize the power of the RF framework, we introduced a global context attention (GCA) module that models the global context and captures long-distance dependencies. The final RAFN was constructed by applying the proposed RF framework to the GCA blocks. Extensive experiments showed that the proposed network achieved better SISR performance with fewer parameters than methods proposed in previous studies.
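The abstract does not spell out the internals of the GCA module, but "modeling the global context and capturing long-distance dependencies" matches the common global-context attention formulation: a softmax over all spatial positions pools the feature map into a single context vector, which is transformed through a bottleneck and broadcast-added back to every position. The following NumPy sketch illustrates that formulation only; the function name, weight shapes, and bottleneck ratio are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_context_attention(x, w_k, w_v1, w_v2):
    """Sketch of a GCNet-style global context attention block.

    x    : (C, H, W) input feature map.
    w_k  : (1, C) context-modeling weights (acts like a 1x1 conv).
    w_v1 : (C_r, C) bottleneck down-projection (C_r < C).
    w_v2 : (C, C_r) bottleneck up-projection.
    Returns x plus a per-channel global-context term shared by all
    spatial positions, i.e. a long-range residual correction.
    """
    C, H, W = x.shape
    flat = x.reshape(C, H * W)               # (C, N) positions as columns
    attn = softmax(w_k @ flat, axis=-1)      # (1, N) attention over positions
    context = flat @ attn.T                  # (C, 1) pooled global context
    z = w_v2 @ np.maximum(w_v1 @ context, 0) # bottleneck transform + ReLU
    return x + z.reshape(C, 1, 1)            # broadcast add to every position
```

Because the context term is added residually, zeroing the up-projection `w_v2` reduces the block to the identity, which is consistent with the long/short skip-connection design the abstract describes.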
