Open Access
Real-time Defect Detection for Fast-moving Fabrics on Circular Knitting Machine under Various Illumination Conditions
Author(s) -
Yan-Qin Ni,
Pei-Kai Huang,
Ching-Han Yang,
Chin-Chun Chang,
Wei-Jen Wang,
Deron Liang
Publication year - 2025
Publication title -
IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3593335
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
In industrial production, automated inspection methods for circular knitting machines face several challenges. First, the rapid movement of fabric on these machines makes it difficult for existing fabric defect detection methods to capture and process the motion effectively. Second, because practical constraints aimed at maintaining high yield rates limit the occurrence of defects, collecting sufficient abnormal fabric samples for model training is costly. Third, circular knitting machines typically operate under varying illumination conditions, further complicating accurate fabric defect detection. Finally, existing methods usually fail to recognize the cutline patterns that are integral to the fabric's design and mistake cutlines for v-line defects. As a result, existing fabric defect detection methods often struggle to balance real-time processing, few-shot learning, and high accuracy under varying illumination conditions.
To address these challenges, we adopt a few-shot learning approach and propose a novel real-time fabric defect detection method for circular knitting machines that achieves high accuracy even under varying illumination conditions. The proposed mechanism consists of two components: LBUnet and a false-alarm filter for cutlines. First, to tackle the challenges of real-time detection, limited training data, and varying illumination, we develop LBUnet, a lightweight semantic segmentation model that leverages local binary (LB) convolution to handle variable lighting conditions. Second, to address the specific challenge of distinguishing v-line defects from cutlines, we propose a false-alarm filtering method that ensures accurate defect identification by exploiting time-series data composed of consecutive segmentation maps generated by LBUnet.
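The illumination robustness that LB convolution builds on can be illustrated with the classic local binary pattern it derives from: each pixel is encoded by the signs of its differences with its 8 neighbors, so any uniform brightness shift leaves the codes unchanged. This is a minimal NumPy sketch of that property, not the LBUnet layer itself (whose learned weights and architecture are not given in the abstract).

```python
import numpy as np

def lbp_codes(img):
    """8-neighbor local binary pattern codes for the interior pixels.

    Each interior pixel is compared with its 8 neighbors; a neighbor that is
    >= the center sets one bit of an 8-bit code. Only the *sign* of the
    intensity difference matters, so the codes are invariant to uniform
    brightness shifts -- the property LB convolution exploits.
    """
    center = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy: img.shape[0] - 1 + dy,
                       1 + dx: img.shape[1] - 1 + dx]
        codes |= (neighbor >= center).astype(np.uint8) << bit
    return codes

img = np.arange(25, dtype=np.int64).reshape(5, 5)
# Adding a constant illumination offset leaves the codes unchanged.
print(np.array_equal(lbp_codes(img), lbp_codes(img + 40)))  # True
```

Real LB convolution (Juefei-Xu et al.) generalizes this idea into fixed sparse binary filters followed by learned 1x1 convolutions, which keeps the layer lightweight while preserving the difference-sign structure shown here.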
Extensive experiments demonstrate that the proposed method delivers both high defect detection accuracy and real-time processing performance for fast-moving fabrics on circular knitting machines under diverse lighting conditions. Specifically, using only LBUnet, our approach achieved an average Mean Intersection over Union (mIoU) of 86.24% with an average processing time of just 4 milliseconds per image. When the false-alarm filtering component was incorporated, the system achieved 100% accuracy in detecting cutlines.
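One plausible reading of the time-series false-alarm filter is a persistence test over consecutive segmentation maps: a v-line defect reappears frame after frame, while a cutline, which travels with the fabric, passes the camera only briefly. The sketch below implements that idea with a sliding window; the window size, threshold, and decision rule are illustrative assumptions, not the paper's actual parameters.

```python
from collections import deque

def make_persistence_filter(window=5, min_hits=4):
    """Hypothetical false-alarm filter over consecutive segmentation maps.

    Confirms a v-line defect only if a line is segmented in at least
    `min_hits` of the last `window` frames; a briefly visible cutline
    fails this persistence test and is filtered out.
    """
    history = deque(maxlen=window)

    def update(line_detected_in_frame):
        history.append(bool(line_detected_in_frame))
        # Require a full window before confirming anything.
        return len(history) == window and sum(history) >= min_hits

    return update

# A line seen in every frame is confirmed once the window fills;
# a transient line (e.g. a passing cutline) never is.
persistent = make_persistence_filter()
print([persistent(True) for _ in range(5)])   # [False, False, False, False, True]
transient = make_persistence_filter()
print([transient(x) for x in [True, True, False, False, False]])  # all False
```

In a real deployment the per-frame boolean would come from LBUnet's segmentation map (e.g. whether a line-shaped region exceeds some area), and the window length would be tied to the fabric speed.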
