23 September is International Sign Language Day. The world of sign languages is diverse: an estimated 200 different sign languages are used around the world. American Sign Language (ASL) is one of the best-known and best-researched of them. It originated in France but has developed into a language in its own right in the USA and Canada, and it is now used in many parts of Latin America, Africa and Southeast Asia.
What makes sign language so special? It is a visual language that uses the hands and the whole body. Facial expressions, mouth patterns (mouthing) and subtle sounds complement the manual signs and give the language tremendous expressive power.
It is important to acknowledge AI’s current limitations when it comes to sign language. While speech recognition in digital assistants is advancing rapidly, sign language recognition lags far behind. This disparity excludes deaf individuals from many voice-based digital services such as Siri or Alexa. This is where AI can step in to bridge the gap.
Artificial intelligence & sign language.
AI holds the potential to revolutionize the accessibility of sign language. It can dismantle barriers and foster the integration of deaf individuals into social life. However, to fully realize these benefits, it is crucial to continue research efforts and establish ethical guidelines.
AI is already being applied to sign languages in the following areas:
– Sign recognition: AI systems can recognize and interpret signs with increasing precision (a minimal sketch follows this list). This enables:
– Translation into text: Signs are translated into written text in real time, making communication easier.
– Interactive systems: Devices like smartphones or voice assistants can respond to gestures.
– Language acquisition: AI-based learning programs can support sign language acquisition.
– Sign creation by avatars: AI-generated avatars can perform complex signs fluently and naturally. This is useful for creating teaching materials and developing virtual sign language teachers.
– Translation into sign language: Written text can be translated into signs and displayed by an avatar.
– Accessibility and subtitling: AI systems can subtitle videos and films and may eventually translate them directly into sign language.
– Social interaction: AI-based tools can facilitate communication between hearing and deaf people.
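To make the sign-recognition step above more concrete, here is a minimal, illustrative Python sketch. It assumes hand landmarks (21 (x, y) keypoints per frame, as produced by common hand-tracking libraries) are already extracted, and it classifies an isolated sign by nearest-neighbor comparison against stored reference poses. All labels, shapes and data here are hypothetical; production systems use trained neural networks, temporal models and far larger annotated data sets.

```python
import numpy as np

# Hypothetical reference poses: each sign label maps to 21 (x, y) hand
# landmarks, e.g. as produced by common hand-tracking libraries.
# In a real system these would come from a trained model and a large,
# annotated sign language data set.
rng = np.random.default_rng(0)
TEMPLATES = {label: rng.random((21, 2)) for label in ("HELLO", "THANK-YOU", "PLEASE")}

def normalize(landmarks: np.ndarray) -> np.ndarray:
    """Make a pose roughly invariant to hand position and size."""
    centered = landmarks - landmarks.mean(axis=0)        # remove position
    scale = np.linalg.norm(centered)
    return centered / scale if scale > 0 else centered   # remove size

def classify_sign(landmarks: np.ndarray) -> str:
    """Return the label of the closest stored template (1-nearest-neighbor)."""
    observed = normalize(landmarks)
    return min(TEMPLATES, key=lambda lbl: np.linalg.norm(observed - normalize(TEMPLATES[lbl])))

# Example: one (hypothetical) camera frame of 21 (x, y) landmarks.
frame = rng.random((21, 2))
print(classify_sign(frame))  # prints one of the template labels
```

At its simplest, "translation into text" is just this kind of classifier output fed into a text or speech interface; real systems add temporal modelling so that whole signed sentences, not only isolated signs, can be recognized.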
AI-driven sign language translators are revolutionizing communication. Deaf people can now access information and hold conversations without depending on interpreters. This makes the internet and everyday life more barrier-free and inclusive.
However, there are also challenges when combining sign language and artificial intelligence:
– Data quality: The large, high-quality sign language data sets needed to train AI models are still scarce and need to be improved.
– Cultural diversity: Sign languages are diverse and differ significantly between cultures and regions. AI systems must learn to account for this diversity.
– Nuances and context: Sign language is not just a sequence of hand movements; it also includes facial expressions, posture, and context. Recognizing these nuances is a challenge.
– Ethical aspects: There are concerns about privacy, discrimination, and the role of AI in sign language and in human communication in general.
Learning sign language: the virtual sign language teacher
AI-controlled apps and online platforms already offer virtual sign language teachers.
How does it work?
– 3D avatars: These platforms use sophisticated 3D avatars that perform the signs fluently and naturally.
– Customized learning paths: The AI adapts the learning process to the student’s progress.
– Real-time feedback: The AI recognizes the student’s signs and gives immediate feedback on whether they have been performed correctly (a minimal sketch of such feedback follows this list).
– Interactive exercises: Varied exercises train comprehension, expressive signing, and written language skills.
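To illustrate how such real-time feedback could work in principle, the sketch below compares a learner’s recorded hand-landmark sequence with a reference performance of the same sign and turns the average deviation into a score and a short message. The landmark format, the resampling and the threshold are assumptions made for this example, not the method of any particular platform.

```python
import numpy as np

def resample(seq: np.ndarray, length: int) -> np.ndarray:
    """Resample a (frames, 21, 2) landmark sequence to a fixed number of frames."""
    idx = np.linspace(0, len(seq) - 1, length).round().astype(int)
    return seq[idx]

def feedback(learner: np.ndarray, reference: np.ndarray, threshold: float = 0.15) -> str:
    """Score a learner's sign against a reference performance.

    Both inputs are (frames, 21, 2) hand-landmark sequences; the threshold
    is an arbitrary value chosen purely for this illustration.
    """
    frames = min(len(learner), len(reference))
    a, b = resample(learner, frames), resample(reference, frames)
    per_frame_error = np.linalg.norm(a - b, axis=(1, 2))   # deviation per frame
    score = max(0.0, 1.0 - per_frame_error.mean() / threshold)
    if score > 0.8:
        return f"Well signed! ({score:.0%} match)"
    worst = int(per_frame_error.argmax())
    return f"Close, but check your hand shape around frame {worst} ({score:.0%} match)."

# Example with synthetic data: the learner's attempt is the reference plus noise.
rng = np.random.default_rng(1)
reference = rng.random((30, 21, 2))
learner = reference + rng.normal(scale=0.002, size=reference.shape)
print(feedback(learner, reference))
```

In practice, such feedback would also have to account for facial expressions, movement dynamics and the natural variation between signers, which is exactly why the "nuances and context" challenge listed above matters.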