Open Access
Sign language recognition using Kinect sensor based on color stream and skeleton points
Author(s) - Isack Bulugu
Publication year - 2021
Publication title - Tanzania Journal of Science
Language(s) - English
Resource type - Journals
eISSN - 2507-7961
pISSN - 0856-1761
DOI - 10.4314/tjs.v47i2.32
Subject(s) - discriminative model , sign (mathematics) , computer science , artificial intelligence , sign language , pattern recognition (psychology) , skeleton (computer programming) , feature (linguistics) , computer vision , mathematics , mathematical analysis , linguistics , philosophy , programming language
This paper presents a sign language recognition system based on the color stream and skeleton points. Several approaches have been established to address sign language recognition problems; however, most of them still suffer from poor recognition accuracy. The proposed approach uses the Kinect sensor's color stream together with skeleton points from the depth stream to improve recognition accuracy. Techniques within this approach use hand trajectories and hand shapes to address sign recognition challenges: for each sign, a representative feature vector consisting of hand trajectories and hand shapes is extracted. A sparse dictionary learning algorithm, Label Consistent K-SVD (LC-KSVD), is applied to obtain a discriminative dictionary, and on that basis the system was extended with a new classification approach for better results. The proposed system was evaluated on 21 sign words, including one-handed and two-handed signs. It achieved a high recognition accuracy of 98.25%, and an average accuracy of 95.34% for signer-independent recognition.

Keywords: Sign language, Color stream, Skeleton points, Kinect sensor, Discriminative dictionary.
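The LC-KSVD pipeline summarized in the abstract sparse-codes each feature vector over a learned, label-consistent dictionary and classifies from the resulting code. The paper's actual features, dictionary, and classifier are not reproduced here; the sketch below only illustrates the general sparse-coding classification idea, using a toy orthonormal dictionary, a minimal Orthogonal Matching Pursuit, and a simple rule that assigns a sample to the class whose atoms carry most of the code energy. All names, the dictionary, and the classification rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def omp(D, x, k):
    """Sparse-code x over dictionary D using at most k atoms
    (a minimal Orthogonal Matching Pursuit)."""
    residual = x.copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit of x on the selected atoms.
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coef[:] = 0.0
        coef[support] = sol
        residual = x - D @ coef
    return coef

def classify(D, atom_labels, x, k=2):
    """Assign x to the class whose atoms carry most of the sparse-code energy
    (a stand-in for the paper's discriminative classification step)."""
    coef = omp(D, x, k)
    classes = np.unique(atom_labels)
    energy = [np.sum(coef[atom_labels == c] ** 2) for c in classes]
    return int(classes[int(np.argmax(energy))])

# Toy dictionary: 8 orthonormal atoms in R^10; first 4 labeled class 0, rest class 1.
D = np.eye(10)[:, :8]
atom_labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
x = 0.7 * D[:, 0] + 0.3 * D[:, 1]   # a sample built from class-0 atoms
print(classify(D, atom_labels, x))  # → 0
```

In LC-KSVD the dictionary and a linear classifier are trained jointly so that sparse codes of same-class samples look alike; the energy-per-class rule above is only a crude substitute for that learned classifier.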
