Open Access
LSTM Network Classification of Dexterous Individual Finger Movements
Author(s) - Christopher Millar, Nazmul Siddique, Emmett Kerr
Publication year - 2022
Publication title - Journal of Advanced Computational Intelligence and Intelligent Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 20
eISSN - 1343-0130
pISSN - 1883-8014
DOI - 10.20965/jaciii.2022.p0113
Subject(s) - computer science, thumb, artificial intelligence, wearable computer, gesture, signal (programming language), reduction (mathematics), pattern recognition (psychology), artificial neural network, sampling (signal processing), movement (music), speech recognition, computer vision, medicine, philosophy, geometry, mathematics, filter (signal processing), programming language, anatomy, embedded system, aesthetics
Electrical activity is generated in the forearm muscles during the muscular contractions that control dexterous movements of a human finger and thumb. Using this electrical activity as an input to train a neural network to classify finger movements is not straightforward. Low-cost wearable sensors, e.g., the Myo gesture control armband (www.bynorth.com), generally have a lower sampling rate than medical-grade EMG detection systems (200 Hz vs. 2000 Hz). Using sensors such as the Myo, combined with the lower amplitude generated by individual finger movements, makes it difficult to achieve high classification accuracy. The low sampling rate also makes it challenging to distinguish between large numbers of subtle finger movements when using a single network. This research uses two networks, which enables a reduction in the number of movements each network has to classify and in turn improves classification performance. This is achieved by developing and training one LSTM network focused on the extension and flexion signals of the fingers and a separate network trained on thumb movement signal data. By following this method, this research has increased classification accuracy for individual finger movements to between 90% and 100%.
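The paper itself does not include code; the following is a minimal sketch, assuming PyTorch, of the two-network idea described in the abstract: one LSTM classifies windows of finger flexion/extension EMG and a separate LSTM classifies thumb movements, so each network handles a smaller set of classes. The channel count (8 for the Myo armband), window length, layer sizes, and class counts are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch (assumptions: PyTorch, 8-channel Myo EMG at 200 Hz,
# fixed-length windows; layer sizes and class counts are illustrative only).
import torch
import torch.nn as nn


class EMGLSTMClassifier(nn.Module):
    """LSTM over a window of EMG samples followed by a linear classifier."""

    def __init__(self, n_channels: int = 8, hidden_size: int = 128,
                 n_layers: int = 2, n_classes: int = 8):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            num_layers=n_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_channels), e.g. a 200-sample (1 s) window.
        _, (h_n, _) = self.lstm(x)
        return self.fc(h_n[-1])  # logits over movement classes


# Two separate networks, each classifying a smaller set of movements:
finger_net = EMGLSTMClassifier(n_classes=8)  # finger flexion/extension classes
thumb_net = EMGLSTMClassifier(n_classes=4)   # thumb movement classes

# Example forward pass on a dummy batch of EMG windows.
dummy = torch.randn(16, 200, 8)              # (batch, time, channels)
finger_logits = finger_net(dummy)
thumb_logits = thumb_net(dummy)
print(finger_logits.shape, thumb_logits.shape)  # [16, 8] and [16, 4]
```

Splitting the problem across two smaller classifiers in this way reduces the number of output classes each network must separate, which is the mechanism the abstract credits for the improved accuracy at low sampling rates.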