
Sinhala Sign Language Recognition using Leap Motion and Deep Learning
Author(s) - Priyantha Kumarawadu, Mohamad Izzath
Publication year - 2022
Publication title - Journal of Artificial Intelligence and Capsule Networks
Language(s) - English
Resource type - Journals
ISSN - 2582-2012
DOI - 10.36548/jaicn.2022.1.004
Subject(s) - computer science , sign language , artificial intelligence , deep learning , support vector machine , naive bayes classifier , speech recognition , pattern recognition (psychology) , computer vision
This paper presents a sign language recognition system for low-resource Sinhala Sign Language (SSL) using a Leap Motion (LM) controller and a Deep Neural Network (DNN). The system extracts static and dynamic features of SSL hand movements from the LM controller, which captures the position of the palm, the radius of the hand sphere, and the positions of the five fingers, and it is evaluated on 24 selected letters and 6 words. The experimental results show that the proposed DNN model, with an average testing accuracy of 89.2%, outperforms a Naïve Bayes model (73.3% testing accuracy) and a Support Vector Machine (SVM) based model (81.2% testing accuracy). The proposed system, which combines a non-contact 3D LM controller with a machine learning model, therefore has great potential as an affordable solution for people with hearing impairment communicating with hearing people in their day-to-day life across all service sectors.
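The abstract does not specify the network architecture, so the following is only a minimal sketch of a feed-forward DNN classifier over Leap Motion features, assuming one feature vector per sample built from the palm position (3 values), the hand-sphere radius (1 value), and the five fingertip positions (3 values each), with 30 output classes (24 letters + 6 words as stated in the paper). Layer sizes, dropout, and the synthetic data are illustrative assumptions, not the authors' configuration.

```python
# Hypothetical sketch of a DNN classifier for Leap Motion hand features;
# not the authors' implementation.
import numpy as np
import tensorflow as tf

NUM_FEATURES = 3 + 1 + 5 * 3   # palm (x, y, z) + sphere radius + 5 fingertips (x, y, z)
NUM_CLASSES = 30               # 24 selected letters + 6 words

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder data with the assumed shapes; a real system would use
# feature vectors captured from the Leap Motion controller.
X_train = np.random.rand(1000, NUM_FEATURES).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=1000)
model.fit(X_train, y_train, epochs=5, batch_size=32, verbose=0)
```

In practice, dynamic signs (the 6 words) would likely require per-frame features over a time window rather than a single static vector, but the static formulation above is enough to illustrate the classification setup compared against the Naïve Bayes and SVM baselines.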