
Self-attention Mechanism based Dynamic Fault Diagnosis and Classification for Chemical Processes
Author(s) - Shuai Chen, Lin Luo, Qilei Xia, Lunjie Wang
Publication year - 2021
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1914/1/012046
Subject(s) - computer science, artificial intelligence, pattern recognition, artificial neural network, encoder, encoding, convolution (computer science), feature extraction, representation, fault detection and isolation, process (computing), benchmark
A dynamic fault detection and diagnosis technique based on a deep encoder-decoder network with a self-attention mechanism is proposed in this paper. Although traditional encoder-decoder networks can extract temporal dependencies, the architecture encodes the input sequence into a fixed-length internal representation, which limits performance, especially on relatively long input sequences. The self-attention mechanism weights the local feature vectors and retains the correlation between the local information of the signal and the process operating state, so that effective feature vectors are extracted. The extracted features are then fed into a bidirectional encoder-decoder network. The resulting deep network not only generalizes the importance of local temporal features but also allows interpretable feature representation and classification simultaneously. Experiments on the benchmark Tennessee Eastman process show that the proposed model achieves better diagnostic performance on receiver operating characteristic (ROC) and precision-recall (PR) curves than a classical diagnostic method based on a long short-term memory (LSTM) network with convolution layers.
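
To make the attention-weighted pooling concrete, below is a minimal sketch in PyTorch of a bidirectional recurrent encoder whose per-step hidden states are weighted by a learned self-attention score before classification. This is an illustrative assumption, not the authors' exact architecture: the class name SelfAttentionFaultClassifier, the layer sizes, and the 52-variable, 21-class Tennessee Eastman configuration are hypothetical choices.

import torch
import torch.nn as nn

class SelfAttentionFaultClassifier(nn.Module):
    # Bidirectional LSTM encoder; a learned self-attention weighting pools
    # the per-step hidden states into one context vector for classification.
    def __init__(self, n_vars=52, hidden=64, n_classes=21):
        super().__init__()
        self.encoder = nn.LSTM(n_vars, hidden, batch_first=True,
                               bidirectional=True)
        self.attn_score = nn.Linear(2 * hidden, 1)  # scores each time step
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, n_vars)
        h, _ = self.encoder(x)                               # (batch, time, 2*hidden)
        weights = torch.softmax(self.attn_score(h), dim=1)   # attention over time
        context = (weights * h).sum(dim=1)                   # weighted local features
        return self.classifier(context)                      # fault-class logits

model = SelfAttentionFaultClassifier()
logits = model(torch.randn(8, 100, 52))                      # 8 windows of 100 steps
print(logits.shape)                                          # torch.Size([8, 21])

Per-class ROC and PR curves such as those reported in the paper can then be computed from the softmaxed logits with standard tools, e.g. sklearn.metrics.roc_curve and sklearn.metrics.precision_recall_curve.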