Open Access
Trimmed categorical cross‐entropy for deep learning with label noise
Author(s) - Rusiecki A.
Publication year - 2019
Publication title - Electronics Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.375
H-Index - 146
ISSN - 1350-911X
DOI - 10.1049/el.2018.7980
Subject(s) - categorical variable , artificial intelligence , entropy (arrow of time) , cross entropy , computer science , noise (video) , pattern recognition (psychology) , speech recognition , statistics , mathematics , machine learning , physics , quantum mechanics , image (mathematics)
Deep learning methods are nowadays considered the state-of-the-art approach to many sophisticated problems, such as computer vision, speech understanding, or natural language processing. However, their performance relies on the quality of large annotated datasets. If the data are not well annotated and label noise occurs, such data-driven models become less reliable. In this Letter, the authors present a very simple way to make the training process robust to noisy labels. Without changing the network architecture or the learning algorithm, the authors apply a modified error measure that improves network generalisation when training with label noise. Preliminary results obtained for deep convolutional neural networks, trained with the novel trimmed categorical cross-entropy loss function, revealed improved robustness at several levels of label noise.
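The abstract does not spell out the loss itself, but the name suggests the standard trimmed-statistics idea: compute the cross-entropy per sample, discard the samples with the largest losses in each batch (those most likely to carry mislabelled targets), and average the rest. The following is a minimal PyTorch sketch of that reading; the function name trimmed_cross_entropy and the trim_fraction parameter are illustrative assumptions, not the authors' notation.

    import torch
    import torch.nn.functional as F

    def trimmed_cross_entropy(logits, targets, trim_fraction=0.1):
        # Per-sample cross-entropy losses (reduction="none" keeps one
        # loss value per sample so individual samples can be trimmed).
        losses = F.cross_entropy(logits, targets, reduction="none")
        batch_size = losses.numel()
        # Drop the trim_fraction of samples with the largest losses,
        # on the assumption that these correspond to noisy labels.
        keep = max(1, batch_size - int(trim_fraction * batch_size))
        trimmed, _ = torch.topk(losses, keep, largest=False)
        # Average only the retained (smallest) losses.
        return trimmed.mean()

In training, such a function would simply replace the usual criterion, e.g. loss = trimmed_cross_entropy(model(inputs), labels, trim_fraction=0.1) followed by loss.backward(); the trimming level would in practice be chosen to roughly match the expected proportion of label noise.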
