


To summarize the results reached in this thesis, a natural user interface using Microsoft Kinect was presented as an improved sign language translator for hearing-impaired people.

Emerging sign languages may be divided into two types: village sign languages and Deaf community sign languages. Village sign languages develop within small communities or villages where transmission is within and between families. They include languages such as Al-Sayyid Bedouin Sign Language (ABSL, Israel), Martha's Vineyard Sign Language (United States), Ban Khor Sign Language (Thailand), Kata Kolok Sign Language (Bali), and Adamarobe Sign Language (Ghana). Deaf community sign languages arise when unrelated signers of different backgrounds are brought together in locations such as cities or schools. In such cases (e.g., Nicaraguan Sign Language and Israeli Sign Language), language learning takes place in large measure between peers. We assume that the social conditions under which a language develops interact with the development of its linguistic structure. Emerging sign languages are crucial for developing and evaluating such assumptions. Because of their young age, much is known about the social conditions and histories of their communities, and their linguistic development is observable from very early stages. These factors make emerging sign languages a natural laboratory for studying the development of linguistic structure and its interaction with the nature of the language community.

Gesture is one of the most natural and expressive communication methods for the hearing impaired. Most researchers, however, focus on either static gestures and postures or a small set of dynamic gestures, owing to the complexity of recognizing dynamic gestures. We propose the Kinect Translation Tool to recognize the user's gestures, so that it can be used for bilateral communication with the hearing impaired.
Since real-time detection of a large number of dynamic gestures must be supported, efficient algorithms and models are required. The dynamic time warping algorithm is used here to detect and translate gestures; the tool then renders the recognized signs as written and spoken words.
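To illustrate the matching step described above, the following is a minimal sketch of dynamic time warping for gesture classification. It is an assumption-laden toy example, not the thesis implementation: each gesture is reduced to a one-dimensional sequence of feature values, whereas real Kinect input would be a sequence of multi-dimensional joint coordinates (in which case the local distance `abs(...)` would be replaced by, e.g., a Euclidean distance between frames).

```python
def dtw_distance(a, b):
    """Return the DTW alignment cost between two feature sequences.

    DTW warps the time axis so that sequences performed at different
    speeds can still be compared frame by frame.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local frame distance
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame of a
                                 cost[i][j - 1],      # skip a frame of b
                                 cost[i - 1][j - 1])  # match both frames
    return cost[n][m]

def classify(query, templates):
    """Return the name of the template gesture nearest to the query.

    `templates` maps gesture names (hypothetical labels) to recorded
    reference sequences; the query is labeled by minimum DTW cost.
    """
    return min(templates, key=lambda name: dtw_distance(query, templates[name]))
```

The key property DTW provides here is speed invariance: a sign performed slowly still aligns cheaply with a faster reference recording, because repeated frames can map onto a single template frame.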
