Open Access
Dual-teacher guided denoising distillation for anomaly detection
Author(s) -
Ning Li,
Ajian Liu,
Chaohao Jiang,
Suigu Tang,
Yongze Li,
Yanyan Liang
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3616789
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
Recently, knowledge distillation-based approaches have demonstrated efficacy in unsupervised anomaly detection. These methods typically compute pointwise feature discrepancies between the teacher and student (T–S) networks to localize anomalies. However, this paradigm suffers from two issues that degrade performance: (1) some normal regions exhibit large feature differences between the T–S networks, leading to false detections, while (2) certain anomalous regions produce small differences, causing missed detections. To address this, we propose a Dual-Teacher Guided Denoising (DTGD) distillation framework that reformulates anomaly detection as a noise-removal task in which anomalies are treated as noise. Specifically, the DTGD framework comprises a normal teacher, an anomaly teacher, and a denoising encoder–decoder student. The normal teacher encodes pristine normal data, while the anomaly teacher captures potential anomaly features from synthetic anomalies, guiding the student network to retain normal features and expel noise. After each encoder block of the student, our teacher-guided noise removal (TNR) module injects knowledge from the anomaly teacher, explicitly teaching the student which features to preserve. Furthermore, consistency and dissimilarity loss functions, applied via pixel-wise supervision, enforce agreement with the normal teacher in both encoder and decoder outputs and push anomaly-related features away from the anomaly teacher. Experiments on anomaly detection datasets demonstrate that DTGD achieves advanced localization accuracy and produces sharper anomaly maps, reducing both false positives and false negatives.
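The consistency and dissimilarity objectives described in the abstract can be sketched as pixel-wise cosine-similarity losses over feature maps. This is an illustrative reconstruction under stated assumptions, not the authors' released code: the function names, the cosine formulation, and the clamping of the dissimilarity term to non-negative values are assumptions made for the sketch.

```python
import numpy as np

def cosine_sim_map(a, b, eps=1e-8):
    """Per-pixel cosine similarity between two (C, H, W) feature maps -> (H, W)."""
    num = (a * b).sum(axis=0)
    den = np.linalg.norm(a, axis=0) * np.linalg.norm(b, axis=0) + eps
    return num / den

def consistency_loss(student_feat, normal_teacher_feat):
    """Pull student features toward the normal teacher: minimize 1 - cos at every pixel."""
    return float((1.0 - cosine_sim_map(student_feat, normal_teacher_feat)).mean())

def dissimilarity_loss(student_feat, anomaly_teacher_feat):
    """Push student features away from anomaly-related features:
    penalize positive cosine similarity, clamped at zero so orthogonal
    or opposed features incur no loss."""
    sim = cosine_sim_map(student_feat, anomaly_teacher_feat)
    return float(np.clip(sim, 0.0, None).mean())

def anomaly_map(student_feat, normal_teacher_feat):
    """At inference, a per-pixel discrepancy map: 1 - cos, high where the
    student fails to reproduce the normal teacher's features."""
    return 1.0 - cosine_sim_map(student_feat, normal_teacher_feat)
```

Under this formulation, a student that perfectly matches the normal teacher drives the consistency loss to zero, while any overlap with the anomaly teacher's features raises the dissimilarity loss; the same discrepancy measure then doubles as the anomaly localization map at test time.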
