Designing Translation Tool: Between Sign Language to Spoken Text on Kinect Time Series Data Using Dynamic Time Warping

Zico Pratama Putera • Mila Desi Anasanti • Bagus Priambodo
Journal article SINERGI • 2018

(English, 10 pages)


Gesture is one of the most natural and expressive communication methods for the hearing impaired. Most researchers, however, focus on static gestures, postures, or a small set of dynamic gestures because of the complexity of recognizing dynamic motion. We propose the Kinect Translation Tool, which recognizes the user's gestures and can therefore support bilateral communication with the deaf community. Because real-time detection of a large number of dynamic gestures must be taken into account, efficient algorithms and models are required; the dynamic time warping algorithm is used here to detect and translate gestures. The Kinect Translation Tool translates sign language into written and spoken words. Conversely, hearing users can reply with speech, which is converted into text accompanied by animated 3D sign language gestures. A user study covering several prototypes of the user interface was carried out with ten participants, who gestured and finger-spelled phrases in American Sign Language (ASL). The speech recognition tests for simple phrases showed good results, and the system also recognized the participants' gestures well during the test. The study suggests that a natural user interface based on Microsoft Kinect can serve as a sign language translator for the hearing impaired.
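The core matching step described above can be sketched with a standard dynamic time warping (DTW) computation over two joint-coordinate time series. This is a minimal illustration, not the authors' implementation: the function names, the Euclidean frame distance, and the template-dictionary classifier are all assumptions for the sketch.

```python
# Minimal DTW sketch for comparing a recorded Kinect gesture trajectory
# against stored sign templates. All names and the distance metric are
# illustrative assumptions, not the paper's exact implementation.
import math

def dtw_distance(a, b):
    """DTW distance between two sequences of frame feature vectors."""
    n, m = len(a), len(b)
    # cost[i][j] = minimal cumulative cost aligning a[:i] with b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])  # Euclidean frame distance
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(query, templates):
    """Return the template label with the smallest DTW distance to query."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))
```

Because DTW warps the time axis, a gesture performed more slowly or quickly than its template still aligns frame-by-frame, which is what makes it suitable for the real-time dynamic gestures the paper targets.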





SINERGI is a peer-reviewed international journal published three times a year in February, June, ...