Interactive game for learning basic Sign Language to enhance the understanding between the hearing and hearing-impaired communities.
Jointly developed by students from the Health and Life Sciences Discipline and the Engineering Discipline, the project aims to create an interactive game that encourages more people to learn basic sign language, enhancing understanding between the hearing and hearing-impaired communities.
Sign language is the natural medium of communication for the Deaf community. There are around 155,000 people with hearing impairment in Hong Kong (2.2% of the total population). According to The Hong Kong Council of Social Service, there are only 56 qualified interpreters on the List of Sign Language Interpreters in Hong Kong. To raise social awareness of the importance of sign language, the students have developed the AI Sign Language Game, facilitated by the VTC STEM Education Centre, in the hope that everyone can learn some basic phrases of Hong Kong Sign Language for communicating with people with hearing impairments.
In this interactive game, the interface first guides the player through a video demonstration, so that the player becomes familiar with specific words and phrases before starting the game. The artificial intelligence system uses computer vision to extract human body, hand, and facial keypoints (130 keypoints in total) from single images in real time; the extracted keypoints are then compared against pre-trained models for image classification and accuracy checking. The sign language recognition framework uses machine learning: a Neural Network and a Support Vector Machine classify the keypoints against the images in the training dataset.
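The classification step described above can be sketched in Python. This is a minimal illustration, not the project's actual code: it assumes the 130 keypoints are flattened into one (x, y) feature vector per sample and trains a Support Vector Machine on synthetic stand-in data; the number of signs, samples, and the noise model are all hypothetical.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical setup: 130 keypoints per frame, each with (x, y)
# coordinates, flattened into a 260-dimensional feature vector.
NUM_KEYPOINTS = 130
NUM_SIGNS = 5          # hypothetical number of sign-language phrases
SAMPLES_PER_SIGN = 40  # synthetic stand-in for real labelled frames

# Synthetic data: each sign is a noisy cluster around a random
# keypoint "template" pose.
templates = rng.normal(size=(NUM_SIGNS, NUM_KEYPOINTS * 2))
X = np.vstack([
    t + 0.1 * rng.normal(size=(SAMPLES_PER_SIGN, NUM_KEYPOINTS * 2))
    for t in templates
])
y = np.repeat(np.arange(NUM_SIGNS), SAMPLES_PER_SIGN)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Support Vector Machine classifier over the keypoint vectors.
clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

In the real system the feature vectors would come from a keypoint extractor running on camera frames rather than from synthetic clusters, and a neural network could be swapped in for (or combined with) the SVM as the classifier.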