Open Access
Deep quantised portrait matting
Author(s) -
Zhang Zhan,
Wang Yuehai,
Yang Jianyi
Publication year - 2020
Publication title -
IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2019.0779
Subject(s) - portrait matting , segmentation , computer science , artificial intelligence , computer vision , pattern recognition , machine learning , regression , mean squared error
Portrait matting is of vital importance for many applications such as portrait editing, background replacement, e-commerce demonstration, and augmented reality. The portrait matte is obtained by predicting the α value for each pixel of the original picture. Previous deep matting methods usually adopt a segmentation network to tackle portrait matting tasks. However, these methods sometimes introduce unpleasant blemishes into the matting results. The authors find that the key factor behind this phenomenon is how the matting problem is modelled. On the one hand, α-value prediction can be modelled as a regression task; on the other hand, it can be viewed as a classification task of predicting background or foreground. To resolve this tension, they explore different ways to model the nature of the α matting problem and propose a novel quantisation-based adaptation. Their method introduces an α quantisation loss to achieve multi-threshold filtering. Furthermore, they apply an α merging block to improve conventional regression methods. With their method, the gradient loss is reduced by 7.53% (relative), and the mean squared error and sum of absolute differences decrease by 14.7% (relative), leading to a visually more pleasant α matte across several segmentation backbones.
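The abstract mentions an α quantisation loss for multi-threshold filtering and an α merging block that combines regression with classification. The full paper is paywalled, so the sketch below is only a plausible illustration of those two ideas, not the authors' actual formulation: the function names, the number of quantisation levels, and the blending weight are all assumptions made for the example.

```python
import numpy as np

def alpha_quantisation_loss(alpha_pred, num_levels=4):
    """Hypothetical sketch of a quantisation-style loss: penalise
    predicted alpha values that fall far from a set of discrete
    levels in [0, 1] (multi-threshold filtering intuition).

    alpha_pred : array of predicted alpha values in [0, 1]
    num_levels : assumed number of quantisation levels
    """
    levels = np.linspace(0.0, 1.0, num_levels)  # e.g. [0, 1/3, 2/3, 1]
    # Distance from each prediction to its nearest quantisation level.
    dists = np.abs(alpha_pred[..., None] - levels[None, :])
    return float(np.mean(dists.min(axis=-1)))

def merge_alpha(alpha_regression, alpha_classification, weight=0.5):
    """Hypothetical 'alpha merging': blend a soft regression estimate
    with a hard foreground/background classification estimate."""
    return weight * alpha_regression + (1.0 - weight) * alpha_classification
```

Under this toy formulation, predictions that sit exactly on a quantisation level (e.g. pure foreground α = 1 or pure background α = 0) incur zero loss, while ambiguous mid-range predictions are pushed toward the nearest level; the merging step lets classification sharpen the regression output in clearly fore/background regions.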
