Open Access
Deep pixel‐to‐pixel network for underwater image enhancement and restoration
Author(s) - Sun Xin, Liu Lipeng, Li Qiong, Dong Junyu, Lima Estanislau, Yin Ruiying
Publication year - 2019
Publication title - IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2018.5237
Subject(s) - computer science, underwater, image restoration, pixel, artificial intelligence, encoding (memory), deconvolution, decoding methods, computer vision, noise (video), filter (signal processing), convolution (computer science), noise reduction, image (mathematics), blind deconvolution, process (computing), image processing, algorithm, artificial neural network
Turbid underwater environments pose great difficulties for vision-based applications. One of the biggest challenges is the complicated noise distribution of underwater images caused by severe scattering and absorption. To alleviate this problem, this work proposes a deep pixel‐to‐pixel network model for underwater image enhancement built on an encoding–decoding framework. It employs convolution layers as the encoder to filter noise, and deconvolution layers as the decoder to recover missing details and refine the image pixel by pixel. Moreover, skip connections are introduced into the network to avoid losing low‐level features while accelerating training. The model achieves image enhancement in a self‐adaptive, data‐driven way rather than relying on a physical model of the underwater environment. Comparison experiments on several datasets show that it outperforms state‐of‐the‐art image restoration methods in underwater image defogging, denoising and colour enhancement.
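The encoder–decoder idea described in the abstract can be illustrated with a toy 1D sketch: a strided convolution plays the encoder (smoothing noise while downsampling), a transposed convolution plays the decoder (upsampling back toward the input resolution), and a skip connection fuses low-level input detail into the decoded output. This is purely illustrative under assumed toy settings (1D signal, a hand-picked smoothing kernel, an arbitrary 0.5 skip weight); the paper's actual model is a 2D convolutional network trained end-to-end.

```python
import numpy as np

def conv1d(x, k, stride=2):
    # valid strided convolution: the "encoder" step (denoise + downsample)
    n = (len(x) - len(k)) // stride + 1
    return np.array([np.dot(x[i * stride:i * stride + len(k)], k) for i in range(n)])

def deconv1d(y, k, stride=2):
    # transposed convolution: the "decoder" step (upsample back)
    out = np.zeros(stride * (len(y) - 1) + len(k))
    for i, v in enumerate(y):
        out[i * stride:i * stride + len(k)] += v * k
    return out

# toy noisy signal standing in for a degraded underwater image row
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 2 * np.pi, 32)) + 0.1 * rng.standard_normal(32)

k = np.array([0.25, 0.5, 0.25])   # smoothing kernel (learned in the real network)
enc = conv1d(x, k)                 # encoded: 15 coarse, denoised samples
dec = deconv1d(enc, k)             # decoded: 31 samples, detail partly lost

# skip connection: add low-level input features to the decoder output
# (0.5 is an arbitrary illustrative weight)
m = min(len(dec), len(x))
skip = dec[:m] + 0.5 * x[:m]
```

In the real network the kernels are learned from data and the skip connections additionally shorten the gradient path, which is why the abstract credits them with accelerating training.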
