Nearly one in five Americans has hearing loss in both ears. These patients face distinct barriers in healthcare, and those barriers have become more visible as telehealth grows more common.
Telehealth enables conversations between patients and providers, but visits often take place without medically qualified American Sign Language (ASL) interpreters, who are essential for clear communication.
Lip reading is not a reliable substitute: even skilled lip readers correctly understand only about 30% of spoken English.
Many Deaf or hard-of-hearing patients use ASL, which differs substantially from spoken and written English. ASL has its own grammar and sentence structure, and facial expressions and head movements carry grammatical meaning.
As a result, exchanging written notes or typed messages can be unclear or confusing, especially in medical visits where precise understanding matters most.
AI-generated video captions can help in telehealth, but they are often inaccurate or unreliable, and faulty captions can cause misunderstandings that compromise patient care.
Compounding the problem, about one-third of American adults find health information hard to understand. Low health literacy makes accessing health information even harder for patients who also face language or hearing barriers.
Together, these issues leave ASL users without the same ease of access to healthcare that hearing patients have.
These gaps point to a clear need for AI tools built specifically to support ASL communication in healthcare and beyond.
NVIDIA's Signs platform, for example, shows how AI can support ASL learning and real-time use.
Built with the American Society for Deaf Children and Hello Monday, Signs is a web platform with a digital library of ASL signs demonstrated by a 3D avatar.
Its AI analyzes users' signing through a webcam and provides real-time feedback on accuracy to support learning.
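As a rough illustration of how this kind of webcam feedback loop can work, the sketch below uses the open-source MediaPipe and OpenCV libraries to extract hand landmarks from live video and compare them against a stored reference pose. This is a simplified stand-in for a trained recognition model, not NVIDIA's actual pipeline, and the reference file name is hypothetical.

```python
import cv2
import mediapipe as mp
import numpy as np

# Illustrative loop: capture webcam frames, extract hand landmarks, and compare
# them to a stored reference pose to give the user rough feedback.
hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
reference = np.load("reference_sign_landmarks.npy")  # hypothetical (21, 3) reference pose

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        current = np.array([[p.x, p.y, p.z] for p in lm])  # (21, 3) landmark coordinates
        distance = np.linalg.norm(current - reference)      # crude similarity score
        feedback = "close!" if distance < 0.5 else "keep adjusting"
        cv2.putText(frame, feedback, (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("signing feedback", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a real system, the pose comparison would be replaced by a model trained on validated sign videos, and feedback would account for motion over time rather than a single hand shape.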
Community volunteers contribute sign videos to the collection, which NVIDIA aims to grow to 400,000 clips covering 1,000 signs, and experts fluent in ASL validate each video for accuracy.
Projects like this address the shortage of high-quality AI training data for ASL; to date, languages such as English have had far more data available.
In healthcare, such AI could eventually translate between ASL and English in real time, making conversations easier and clearer.
Other companies, such as Migam.ai, build AI tools that convert ASL video into text and text into ASL video.
Their systems use Transformer models to produce fluent, accurate translations.
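A minimal sketch of what such a sign-to-text model can look like is shown below, assuming pose features have already been extracted from each video frame. The dimensions, vocabulary size, and layer counts are illustrative choices, not Migam.ai's architecture.

```python
import torch
import torch.nn as nn

class SignToTextTransformer(nn.Module):
    """Toy encoder-decoder: per-frame pose features in, English text tokens out."""
    def __init__(self, feat_dim=126, vocab_size=8000, d_model=256):
        super().__init__()
        self.frame_proj = nn.Linear(feat_dim, d_model)        # embed each frame's features
        self.tok_embed = nn.Embedding(vocab_size, d_model)    # embed target text tokens
        self.transformer = nn.Transformer(d_model=d_model, nhead=4,
                                          num_encoder_layers=3, num_decoder_layers=3,
                                          batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, frames, tokens):
        # frames: (batch, n_frames, feat_dim); tokens: (batch, n_tokens)
        src = self.frame_proj(frames)
        tgt = self.tok_embed(tokens)
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)  # (batch, n_tokens, vocab_size) logits

# Smoke test with random data: 126 features could be 2 hands x 21 landmarks x 3 coords.
model = SignToTextTransformer()
frames = torch.randn(2, 60, 126)           # 2 clips, 60 frames each
tokens = torch.randint(0, 8000, (2, 10))   # 2 partial target sentences, 10 tokens each
print(model(frames, tokens).shape)         # torch.Size([2, 10, 8000])
```

In practice the encoder would consume features from a video backbone and decoding would proceed token by token; the smoke test only confirms that the shapes line up.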
Their team includes Deaf people, ASL interpreters, and AI specialists, which helps keep the tools grounded in Deaf users' needs.
These tools have attracted backing from large companies and public institutions, a sign of growing interest in using AI for ASL accessibility.
On the technical side, researchers have used deep learning approaches such as convolutional neural networks (CNNs) to recognize ASL signs with more than 99% accuracy.
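For context, a minimal CNN classifier of the kind used in such studies might look like the following. It assumes 28x28 grayscale images and 24 letter classes, as in the public Sign Language MNIST dataset, and is an illustrative sketch rather than a reproduction of any specific paper's model.

```python
import torch
import torch.nn as nn

class ASLFingerspellCNN(nn.Module):
    """Small CNN for static ASL fingerspelling images (28x28 grayscale, 24 classes)."""
    def __init__(self, n_classes=24):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, 128), nn.ReLU(),  # 28 -> 14 -> 7 after two poolings
            nn.Linear(128, n_classes),
        )

    def forward(self, x):  # x: (batch, 1, 28, 28)
        return self.classifier(self.features(x))

model = ASLFingerspellCNN()
print(model(torch.randn(4, 1, 28, 28)).shape)  # torch.Size([4, 24])
```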
One project translates ASL into Nepali text and speech, showing how AI can improve Deaf users' interaction with computers around the world.
Most Deaf children are born to hearing parents, and these families must learn ASL quickly to communicate with their children.
Platforms like Signs give families an accessible way to start learning ASL early, even with infants as young as six to eight months, opening important opportunities for communication and language development.
In healthcare, early ASL communication helps patients receive appropriate care, understand medical advice, and feel safe with their providers. Family members who can sign reduce stress and prevent misunderstandings during medical visits.
Medically qualified interpreters remain essential in clinical settings, but AI tools that support the interpreting process can ease communication, especially in telehealth visits where an interpreter may not be readily available.
For healthcare administrators and IT staff, AI tools and automation offer benefits well beyond the conversation itself.
AI can streamline appointment scheduling, reminders, patient registration, and follow-up, and each of these touchpoints can be made more accessible to Deaf and hard-of-hearing patients.
Simbo AI illustrates how AI phone systems and answering services can smooth front-office communication.
By routing routine calls to AI agents, clinics cut wait times and reduce the number of simple questions staff must field.
For Deaf patients, these systems can offer text and video channels with ASL support, helping ensure that communication is both effective and accessible.
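As a rough sketch of how such a workflow might branch on a patient's stated communication preference, the example below routes an appointment reminder to SMS, voice, or an email carrying a link to an ASL video. The data fields, channel names, and URL are hypothetical and not tied to any particular product.

```python
from dataclasses import dataclass

# Hypothetical illustration: choose a reminder channel from a patient's preference,
# including an ASL video option for Deaf and hard-of-hearing patients.
@dataclass
class Patient:
    name: str
    phone: str
    email: str
    preferred_mode: str  # "voice", "text", or "asl_video"

def build_reminder(patient: Patient, appt_time: str) -> dict:
    """Return a message payload matched to the patient's preferred channel."""
    base = f"Reminder for {patient.name}: appointment on {appt_time}."
    if patient.preferred_mode == "asl_video":
        # Link to a pre-recorded or generated ASL video explaining the visit.
        return {"channel": "email", "to": patient.email,
                "body": base + " Signed (ASL) version: https://example.org/asl/reminder"}
    if patient.preferred_mode == "text":
        return {"channel": "sms", "to": patient.phone, "body": base}
    return {"channel": "voice", "to": patient.phone, "body": base}

print(build_reminder(Patient("A. Rivera", "+1-555-0100", "a@example.org", "asl_video"),
                     "March 3, 10:00 AM"))
```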
During medical visits, AI tools can translate ASL into spoken or written English in real time, helping clinicians who do not know ASL communicate more effectively.
The same tools can assist interpreters by transcribing or displaying signs, making their work faster and more accurate.
Delivering medical documents and medication instructions as ASL video can help patients understand and follow their treatment.
AI can generate signed video explanations so patients are not forced to rely solely on written materials, which can be difficult to parse.
In telehealth visits, AI-driven ASL interpreting or avatar assistants can help close communication gaps, but they must comply with laws such as the ADA and the ACA so that medical language support remains trustworthy and private.
These AI tools can also be integrated into a range of healthcare provider systems, which matters because hospitals and clinics operate in many different ways.
Customizing AI to fit existing software and workflows makes it far more likely that clinics will adopt and use the tools successfully.
Building and deploying AI tools designed for ASL helps reduce the disparities Deaf and hard-of-hearing people face in U.S. healthcare.
Real-time communication tools let Deaf patients participate in their own care, understand instructions, and make informed decisions, lowering the risk of errors, misdiagnoses, or incomplete care caused by communication breakdowns.
Medical administrators and facility owners should invest in tools that meet accessibility requirements and serve all patients.
AI tools for ASL offer a way to improve communication without always depending on in-person interpreters, which is especially valuable in rural or underserved areas.
Healthcare IT leaders play a key role in verifying that AI tools meet medical standards, integrate with existing systems, and comply with privacy and accessibility regulations.
By adopting AI tools for American Sign Language, healthcare leaders can strengthen communication, improve care for Deaf patients, and build a more inclusive healthcare system in the U.S.
ASL is the third most prevalent language in the U.S., yet AI tools trained on ASL data are far scarcer than those for dominant languages like English and Spanish, highlighting a critical need for accessible ASL technology.
Signs is an interactive web platform that supports ASL learning and accessible AI application development, featuring a 3D avatar to demonstrate signs and AI analysis of webcam footage to provide real-time signing feedback.
Users of any skill level can record themselves signing specific words to help build a validated video dataset, which NVIDIA aims to grow to 400,000 clips representing 1,000 signs.
Most deaf children are born to hearing parents; accessible tools like Signs enable family members to learn ASL early, opening effective communication channels even with very young children.
Fluent ASL users and interpreters validate the dataset to ensure each sign’s accuracy, resulting in a high-quality visual dictionary and reliable teaching tool.
The team plans to integrate facial expressions, head movements, regional sign variations, and slang, enhancing the platform’s ability to capture the full nuance of ASL communication.
Tools like these could break down barriers between deaf and hearing communities by enabling real-time AI-powered support, digital human applications, and improved video conferencing tools with ASL support.
Volunteers can record and contribute their signing, expanding the dataset’s diversity and helping to refine the AI models supporting ASL recognition and feedback.
RIT researchers evaluate and improve the user experience of the Signs platform, ensuring it effectively serves deaf and hard-of-hearing users’ accessibility needs.
The dataset is planned for public release later in the year, enabling broader access for developers and researchers to build accessible ASL-related AI technologies.