
Translation of Gesture-Based Static Sign Language to Text and Speech
Author(s) -
Sabitha Gauni,
Ankit Bastia,
B Sohan Kumar,
Prakhar Soni,
Vineeth Pydi
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1964/6/062074
Subject(s) - gesture, sign language, computer science, sign (mathematics), facial expression, speech recognition, linguistics, population, communication, natural language processing, psychology, artificial intelligence, sociology, mathematics, mathematical analysis, philosophy, demography
As human beings, most of us convey our thoughts through speech and facial expressions, but according to a recent survey, roughly 1% of the population in India is deaf and mute. These people communicate with others using hand gestures and facial expressions; however, most people find such gestures difficult to understand. To bridge this gap, we develop static gesture classification based on sign language standards and then convert the recognized gestures to text and speech in a given local dialect.
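The abstract outlines a pipeline of static gesture classification followed by conversion to text and then speech. The paper's actual model is not described in this record, so the following is only a minimal sketch under stated assumptions: gestures are represented as feature vectors (e.g., flattened hand-landmark coordinates), classification is a simple nearest-neighbor match against labeled templates, and the resulting text would be passed to a text-to-speech engine. All names and template values here are hypothetical.

```python
import math

# Hypothetical reference templates: gesture label -> feature vector.
# A real system would learn these from labeled sign-language images
# rather than hard-coding them.
TEMPLATES = {
    "A": [0.1, 0.9, 0.2],
    "B": [0.8, 0.1, 0.7],
    "HELLO": [0.5, 0.5, 0.5],
}

def classify_gesture(features):
    """Nearest-neighbor match: return the label of the closest template."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda label: dist(TEMPLATES[label], features))

def gesture_to_text(features):
    # Map the predicted static sign to its text form. A TTS engine
    # (such as pyttsx3 or gTTS) would then voice this string in the
    # chosen local dialect; that step is omitted here.
    return classify_gesture(features)

print(gesture_to_text([0.52, 0.48, 0.51]))  # closest to the HELLO template
```

The nearest-neighbor matcher stands in for whatever classifier the authors trained; swapping in a learned model only changes `classify_gesture`, leaving the text and speech stages untouched.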