Open Access
Mixture separability loss in a deep convolutional network for image classification
Author(s) -
Do Trung Dung,
Jin ChengBin,
Nguyen Van Huan,
Kim Hakil
Publication year - 2019
Publication title -
iet image processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2018.5613
Subject(s) - computer science, cross entropy, entropy (arrow of time), artificial intelligence, class (philosophy), pattern recognition (psychology), feature (linguistics), function (biology), contextual image classification, image (mathematics), machine learning, linguistics, philosophy, physics, quantum mechanics, evolutionary biology, biology
In machine learning, the cost function is crucial because it measures how well or poorly a system performs. In image classification, well‐known networks focus only on modifying the network structure and applying cross‐entropy loss at the end of the network. However, using only cross‐entropy loss causes a network to stop updating its weights once all training images are correctly classified; this is the problem of early saturation. This study proposes a novel cost function, called mixture separability loss (MSL), which continues updating the weights of the network even when most of the training images are accurately predicted. MSL consists of a between‐class and a within‐class loss. The between‐class loss maximises the differences between inter‐class images, whereas the within‐class loss minimises the variation among intra‐class images. The authors designed the proposed loss function to attach to different convolutional layers in the network in order to utilise intermediate feature maps. Experiments show that a network with MSL deepens the learning process and obtains promising results on several public datasets, such as Street View House Number, Canadian Institute for Advanced Research, and the authors’ self‐collected Inha Computer Vision Lab gender dataset.
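The abstract's idea of combining a within‐class term (compactness) with a between‐class term (separation) can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation; the function name, the centroid‐based distance terms, and the `alpha`/`beta` weights are assumptions introduced here for illustration only.

```python
import numpy as np

def mixture_separability_sketch(features, labels, alpha=0.5, beta=0.5):
    """Hypothetical sketch of a mixture-separability-style loss.

    The within-class term pulls each feature toward its class centroid
    (minimising intra-class variation); the between-class term is the
    negative mean pairwise centroid distance, so minimising the total
    loss pushes class centroids apart (maximising inter-class separation).
    """
    classes = np.unique(labels)
    idx = {c: i for i, c in enumerate(classes)}
    centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])

    # Within-class loss: mean squared distance of each feature to its own centroid.
    within = np.mean([np.sum((f - centroids[idx[l]]) ** 2)
                      for f, l in zip(features, labels)])

    # Between-class loss: negative mean squared distance between centroid pairs.
    pair_dists = [np.sum((centroids[i] - centroids[j]) ** 2)
                  for i in range(len(classes))
                  for j in range(i + 1, len(classes))]
    between = -np.mean(pair_dists)

    return alpha * within + beta * between
```

As a sanity check, tight and well‐separated class clusters should yield a lower loss than overlapping ones, since the within‐class term shrinks and the between‐class term grows more negative.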
