Open Access
TDCC: top‐down semantic aggregation for colour constancy
Author(s) -
Li Xiaoqiang,
Zhu Yaqin,
Han Jiayue,
Li Jide,
Lian Huicheng,
Tong Weiqin
Publication year - 2019
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2019.0480
Subject(s) - standard illuminant , computer science , benchmark (surveying) , artificial intelligence , color constancy , pyramid (geometry) , semantics (computer science) , convolutional neural network , pattern recognition (psychology) , image (mathematics) , scale (ratio) , computer vision , mathematics , geography , geometry , geodesy , programming language , cartography
Images captured under an illuminated scene often have their original colours contaminated by the illuminant; colour constancy studies how to restore them. Substantial progress on colour constancy has been made in recent years owing to the development of convolutional neural networks (CNNs). In a CNN, high‐level features carry semantic information while low‐level features capture local detail; taking both into account helps achieve more accurate illuminant estimation. However, previous works paid little attention to the low‐level features, for lack of a framework that can combine the two kinds of features. Inspired by the pyramid model, a top‐down network is proposed that successively propagates high‐level information to low‐level layers. This network, named top‐down semantic aggregation for colour constancy (TDCC), takes full advantage of multi‐scale representations with strong semantics. As a result, objects with intrinsic colours are captured and a better estimate is obtained. Experiments on three benchmark datasets demonstrate that TDCC significantly outperforms state‐of‐the‐art colour constancy methods.
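The abstract describes a pyramid‐inspired, top‐down pathway that pushes semantically strong high‐level features down to high‐resolution low‐level layers before estimating the illuminant. The exact TDCC architecture (backbone, channel widths, fusion head, loss) is not given here, so the following is only a minimal sketch assuming an FPN‐style top‐down aggregation with hypothetical layer sizes and a simple pooled regression head producing a unit‐norm RGB illuminant.

```python
# Minimal sketch of top-down semantic aggregation for illuminant estimation.
# Assumption: FPN-style lateral + top-down fusion; not the authors' exact TDCC design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopDownAggregation(nn.Module):
    def __init__(self, in_channels=(64, 128, 256), mid_channels=64):
        super().__init__()
        # Lateral 1x1 convs project each backbone stage to a common width.
        self.laterals = nn.ModuleList(
            nn.Conv2d(c, mid_channels, kernel_size=1) for c in in_channels
        )
        # 3x3 convs smooth each merged feature map.
        self.smooth = nn.ModuleList(
            nn.Conv2d(mid_channels, mid_channels, kernel_size=3, padding=1)
            for _ in in_channels
        )
        # Regress a 3-vector (RGB illuminant) from the finest merged map.
        self.head = nn.Linear(mid_channels, 3)

    def forward(self, feats):
        # feats: list of backbone feature maps, ordered low-level -> high-level.
        laterals = [lat(f) for lat, f in zip(self.laterals, feats)]
        # Top-down pass: upsample semantic features and add them to lower levels.
        for i in range(len(laterals) - 1, 0, -1):
            laterals[i - 1] = laterals[i - 1] + F.interpolate(
                laterals[i], size=laterals[i - 1].shape[-2:], mode="nearest"
            )
        merged = [s(x) for s, x in zip(self.smooth, laterals)]
        # Pool the finest (highest-resolution, semantics-enriched) map and regress.
        pooled = merged[0].mean(dim=(2, 3))
        illuminant = F.normalize(self.head(pooled), dim=1)  # unit-norm RGB estimate
        return illuminant

# Usage sketch with dummy multi-scale features from a hypothetical backbone.
if __name__ == "__main__":
    feats = [torch.randn(2, 64, 64, 64),
             torch.randn(2, 128, 32, 32),
             torch.randn(2, 256, 16, 16)]
    print(TopDownAggregation()(feats).shape)  # torch.Size([2, 3])
```

The design choice illustrated is the one the abstract motivates: the estimate is read from the high‐resolution map after it has been enriched with top‐down semantic context, rather than from the coarse top layer alone.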
