
Open Access
Learning with Noisy Labels: Interconnection of Two Expectation-Maximizations
Author(s): Heewon Kim, Hyun Sung Chang, Kiho Cho, Jaeyun Lee, Bohyung Han
Publication year: 2024
Labor-intensive labeling becomes a bottleneck in developing computer vision algorithms based on deep learning. For this reason, dealing with imperfect labels has increasingly gained attention and has become an active field of study. We address the learning with noisy labels (LNL) problem, which is formalized as a task of finding a structured manifold in the midst of noisy data. In this framework, we provide a proper objective function and an optimization algorithm based on two expectation-maximization (EM) cycles. The separate networks associated with the two EM cycles collaborate to optimize the objective function, where one model distinguishes clean labels from corrupted ones while the other refurbishes the corrupted labels. This approach results in a non-collapsing LNL-flywheel model in the end. Experiments show that our algorithm achieves state-of-the-art performance on multiple standard benchmarks with substantial margins under various types of label noise.
Language(s): English
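
The two-network flywheel described in the abstract can be pictured with a minimal sketch. The sketch below is an illustrative assumption, not the authors' implementation: it assumes one EM cycle separates clean from corrupted samples by fitting a two-component Gaussian mixture to per-sample losses, and the other refurbishes the samples judged corrupted using the second model's predictions. The helper names `e_step_clean_probs` and `refurbish_labels`, the threshold, and the toy data are all hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def e_step_clean_probs(per_sample_losses):
    """Posterior probability that each sample is clean, estimated by fitting
    a two-component GMM to the losses and reading off membership in the
    low-mean (low-loss) component."""
    losses = np.asarray(per_sample_losses, dtype=float).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, reg_covar=1e-4, random_state=0).fit(losses)
    clean_comp = int(np.argmin(gmm.means_.ravel()))
    return gmm.predict_proba(losses)[:, clean_comp]

def refurbish_labels(noisy_labels, refurbisher_probs, clean_probs, threshold=0.5):
    """Keep labels judged clean; replace the rest with the refurbishing
    model's argmax prediction (a common heuristic, not necessarily the
    paper's exact update)."""
    labels = np.asarray(noisy_labels).copy()
    corrupted = clean_probs < threshold
    labels[corrupted] = np.argmax(refurbisher_probs[corrupted], axis=1)
    return labels

# One toy round of the flywheel: model A's losses drive the clean/corrupted
# split (one EM cycle); model B's predictions refurbish the corrupted labels
# (the other cycle); the refurbished labels would then retrain model A.
rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(0.2, 0.05, 80),   # low-loss, clean-looking
                         rng.normal(1.5, 0.30, 20)])  # high-loss, suspect
noisy_labels = rng.integers(0, 10, size=100)
model_b_probs = rng.dirichlet(np.ones(10), size=100)  # stand-in for model B's softmax

clean_probs = e_step_clean_probs(losses)
new_labels = refurbish_labels(noisy_labels, model_b_probs, clean_probs)
print(f"{(new_labels != noisy_labels).sum()} labels refurbished")
```

In this reading, each model supplies the other's E-step targets, which is what keeps the cycle from collapsing onto its own noisy predictions.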
