Open Access
Attention-Based Real Image Restoration
Author(s) - Saeed Anwar, Nick Barnes, Lars Petersson
Publication year - 2021
Publication title - IEEE Transactions on Neural Networks and Learning Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.882
H-Index - 212
eISSN - 2162-2388
pISSN - 2162-237X
DOI - 10.1109/tnnls.2021.3131739
Subject(s) - computing and processing , communication, networking and broadcast technologies , components, circuits, devices and systems , general topics for engineers
Deep convolutional neural networks perform well on images containing spatially invariant degradations, also known as synthetic degradations; however, their performance is limited on real degraded photographs, which typically require multi-stage network modeling. To advance the practicability of restoration algorithms, this article proposes a novel single-stage blind real image restoration network (R²Net) built from a modular architecture. We use a residual-on-the-residual structure to ease the flow of low-frequency information and apply feature attention to exploit channel dependencies. Furthermore, evaluation in terms of quantitative metrics and visual quality on four restoration tasks, i.e., denoising, super-resolution, raindrop removal, and JPEG compression, across 11 real degraded datasets and against more than 30 state-of-the-art algorithms, demonstrates the superiority of our R²Net. We also present a comparison on three synthetically generated degraded datasets for denoising to showcase our method's capability on synthetic denoising. The code, trained models, and results are available at https://github.com/saeed-anwar/R2Net.
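The two ingredients named in the abstract, feature (channel) attention and residual-on-the-residual connections, can be sketched as follows. This is a minimal PyTorch illustration under stated assumptions, not the authors' released implementation (see the linked repository for that): the channel count, reduction ratio, and block layout are placeholders chosen for clarity.

```python
# Minimal sketch of a feature (channel) attention block with residual
# connections, in the spirit of the abstract's description. The layer
# sizes, reduction ratio, and block composition are illustrative
# assumptions, not the exact R2Net configuration.
import torch
import torch.nn as nn


class FeatureAttention(nn.Module):
    """Squeeze-and-excitation style channel attention: global average
    pooling followed by a small bottleneck that rescales each channel."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: B x C x 1 x 1
        self.excite = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),  # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.excite(self.pool(x))
        return x * weights  # reweight channel responses


class ResidualAttentionBlock(nn.Module):
    """Conv-ReLU-Conv followed by feature attention, with a local skip
    connection; stacking such blocks inside an outer long skip gives the
    'residual on the residual' pattern the abstract mentions, letting
    low-frequency content bypass the block bodies."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            FeatureAttention(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)  # local residual connection


if __name__ == "__main__":
    block = ResidualAttentionBlock(channels=64)
    features = torch.randn(1, 64, 32, 32)  # dummy feature map
    print(block(features).shape)  # torch.Size([1, 64, 32, 32])
```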
