How AI helps people master sign language
Sign language is hard for anyone to learn. It relies not only on hand gestures but also on facial expressions, mouth articulation, and overall body movement, and the meaning of a message depends on all of these factors together. Programs for learning sign language already exist, but many of them are not very effective, and they are often expensive. This may soon change: methods for learning sign language with the help of AI are now in development.
The goal is to automate the learning process and make it fully intuitive. A sign-language analysis tool built by a Swedish-German team demonstrates this approach in detail. Its developers hope to create AI platforms that can substantially speed up learning.
AI has been, and continues to be, used to recognize, translate, and interpret sign language. But existing projects are incomplete and do not cover every aspect of the problem. The developers of the new system plan to use AI to tell learners when they make a mistake and show them how to correct it.
Practicing sign language is difficult because it involves neither reading nor writing. For learning on a PC, the developers created a computer game. It teaches gestures step by step, showing exactly which gesture corresponds to a particular letter or concept. The player then repeats the gesture while recording their attempt on camera. The program shows how accurate the repetition was and what to do to improve the result. The game also includes social competition between participants, which keeps users engaged.
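The learn-repeat-feedback loop described above can be sketched roughly as follows. This is a minimal illustration, not the developers' actual implementation: the lesson list, the 0.85 pass threshold, and the scoring stub are all hypothetical, and a real system would score the learner's camera recording with a neural network.

```python
import random

# Signs taught in order during one round (hypothetical lesson plan).
LESSONS = ["A", "B", "C"]

def score_attempt(sign):
    # Placeholder for camera capture + model evaluation.
    # A real system would analyze the recorded video here.
    return random.uniform(0.0, 1.0)

def play_round(player, leaderboard):
    """Run one round: the player attempts each sign, gets feedback,
    and their best total score goes on a shared leaderboard."""
    total = 0.0
    for sign in LESSONS:
        score = score_attempt(sign)
        total += score
        if score < 0.85:  # assumed pass threshold
            print(f"{sign}: {score:.2f} - adjust your gesture and retry")
        else:
            print(f"{sign}: {score:.2f} - well done")
    # Social competition: keep each player's best result.
    leaderboard[player] = max(leaderboard.get(player, 0.0), total)
    return total

leaderboard = {}
play_round("alice", leaderboard)
```

The leaderboard dictionary stands in for the game's social-competition element: players can compare their best scores against each other.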
Artificial intelligence is involved at every stage of learning. First, a convolutional neural network extracts visual information about the user's movements from the video. These data are then passed to an evaluation module, which compares the result with reference examples stored in a database and produces its own assessment.
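The evaluation step can be illustrated with a small sketch. Here the network's output is assumed to be a feature vector (embedding), and the comparison against the database standards is done with cosine similarity; the embeddings, sign names, and threshold below are invented for illustration and are not from the actual system.

```python
import math

# Hypothetical reference embeddings for two signs, as might be produced
# by a convolutional network's feature layer (values invented).
REFERENCE_EMBEDDINGS = {
    "A": [0.9, 0.1, 0.0],
    "B": [0.1, 0.8, 0.3],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def evaluate(user_embedding, target_sign, threshold=0.85):
    """Compare the learner's embedding against the stored standard
    and return a similarity score plus simple pass/fail feedback."""
    score = cosine_similarity(user_embedding, REFERENCE_EMBEDDINGS[target_sign])
    return score, score >= threshold
```

For example, an attempt whose embedding is close to the stored standard for "A" passes, while an attempt resembling a different sign does not.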
All of this is presented in a form the user can understand. The student does not see the whole internal process, only the outward results of the neural network's work.
In the near future, the developers plan to create systems that will help speakers of other language groups master sign language. The hardest part is training the neural network: recording videos that demonstrate every concept and sound takes a long time.
In addition, the system will be taught to “understand” a person's facial expressions in the context of sign-language learning. At the moment, it works best in classroom conditions; in the future, the developers plan to make the AI application more universal, so that it can also be used outdoors. The platform's creators hope their product can help people of any age learn sign language.