Open Access
Low‐light image enhancement based on exponential Retinex variational model
Author(s) -
Chen Xinyu,
Li Jinjiang,
Hua Zhen
Publication year - 2021
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/ipr2.12287
Subject(s) - color constancy , artificial intelligence , computer vision , computer science , image (mathematics) , image restoration , distortion , texture , contrast (vision) , mathematics , image processing
Abstract Low‐light images suffer from residual noise, low contrast, and limited detail. To address these problems, this paper proposes a new Retinex variational model. According to Retinex theory, the illumination and reflectance components decomposed from the original image must be estimated. To better preserve edge information and texture richness and to prevent artefacts, the exponential forms of local variation deviation and total variation are used as the illumination prior and the reflectance prior, respectively, and mixed norms are used to constrain them, so that the illumination information and texture details of the image are handled more effectively. The bright channel prior is then applied to improve the colour reproduction of the original image, yielding the objective function, and an alternating iterative optimization method is used to find the optimal solution of the proposed model. Experiments show that, compared with other existing image enhancement methods, the proposed method improves image contrast, avoids halo artefacts and colour distortion, agrees more closely with human vision, and achieves better quantitative performance.
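The abstract's core ingredients can be illustrated with a minimal sketch. The code below is not the authors' variational model; it only shows, under simple assumptions, the Retinex decomposition S = L · R and an illustrative form of the bright channel prior (per-pixel maximum over colour channels followed by a local maximum over a patch) used here as a crude illumination estimate. The function names and the patch size are the writer's own choices.

```python
import numpy as np

def bright_channel(img, patch=15):
    # img: H x W x 3 float array with values in [0, 1].
    # Bright channel: per-pixel max over colour channels,
    # then a local max over a patch (illustrative sketch of
    # the bright channel prior mentioned in the abstract).
    bc = img.max(axis=2)
    H, W = bc.shape
    pad = patch // 2
    padded = np.pad(bc, pad, mode='edge')
    out = np.empty_like(bc)
    for i in range(H):
        for j in range(W):
            out[i, j] = padded[i:i + patch, j:j + patch].max()
    return out

def retinex_decompose(img, eps=1e-3):
    # Retinex assumes S = L * R (element-wise, per channel),
    # so given an illumination estimate L, the reflectance is
    # recovered as R = S / L.
    L = bright_channel(img)            # illumination estimate
    R = img / (L[..., None] + eps)     # reflectance estimate
    return L, np.clip(R, 0.0, 1.0)
```

In the paper's actual model, L and R are instead obtained by minimizing an objective with exponential local-variation-deviation and total-variation priors via alternating iterations; the sketch above only fixes the decomposition the objective operates on.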
