American Sign Language (ASL) is a primary means of communication for many deaf people in the United States. Studies indicate it is the third most common language in the country, after English and Spanish, and it is used daily by a large number of people. For medical offices and hospitals committed to inclusive care, clear communication with these patients matters at every touchpoint: doctor visits, appointment scheduling, telehealth sessions, and emergencies.
To date, AI tools have focused mostly on written and spoken languages such as English and Spanish, and ASL has often been left out. Most speech and language technologies cannot interpret sign language, which combines complex handshapes, movement, and facial expressions. The shortage of high-quality ASL data has slowed progress on sign language AI, making it difficult for healthcare organizations to communicate well with patients who sign.
To help AI understand ASL, NVIDIA partnered with the American Society for Deaf Children and Hello Monday on a project called “Signs”: a web platform that collects videos of people performing ASL signs under controlled conditions. The goal is a dataset of up to 400,000 clips covering 1,000 distinct signed words.
Each video is reviewed by fluent signers and professional ASL interpreters to confirm it is accurate and suitable for AI training. The project deliberately collects contributions from signers of varying skill levels, which helps models learn the subtle differences in how the same sign looks when produced by different people.
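The article does not describe how Signs stores review results internally. Purely as an illustration of the kind of record a validation pipeline might keep, here is a minimal sketch; every field name is hypothetical:

```python
# Hypothetical per-clip metadata for a validated sign dataset. The Signs
# project's actual schema is not public; field names are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignClip:
    clip_id: str
    gloss: str                      # the signed word, e.g. "DOCTOR"
    contributor_skill: str          # "beginner", "intermediate", or "fluent"
    validated: bool = False         # flipped to True only after expert review
    reviewer: Optional[str] = None  # interpreter or fluent signer who approved

clip = SignClip(clip_id="c-0001", gloss="DOCTOR", contributor_skill="beginner")
# After a fluent signer confirms the sign was produced correctly:
clip.validated = True
clip.reviewer = "reviewer-17"
```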
Signs also includes an AI system that watches users sign through a webcam and gives immediate feedback, helping them correct mistakes as they learn.
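Signs’ implementation has not been published, but the general pattern of such a feedback loop is well established: capture webcam frames, extract hand landmarks, and score them against a reference. The sketch below uses OpenCV for capture and MediaPipe for landmarks; the `score_sign` callback is an assumed stand-in for a trained recognition model.

```python
# Generic webcam feedback loop, NOT the Signs implementation.
# Requires: pip install opencv-python mediapipe
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def feedback_loop(score_sign):
    """Stream webcam frames, extract hand landmarks, and overlay feedback."""
    cap = cv2.VideoCapture(0)  # default webcam
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                # score_sign stands in for a trained classifier that maps
                # landmarks to a (sign label, confidence) pair.
                label, confidence = score_sign(results.multi_hand_landmarks)
                cv2.putText(frame, f"{label} ({confidence:.0%})", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
            cv2.imshow("sign feedback", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
                break
    cap.release()
    cv2.destroyAllWindows()
```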
The platform launched with roughly 100 signs to get users started, with more to be added over time. Regional variations and slang are also on the roadmap, since ASL, like any living language, continues to evolve.
Validating the accuracy of ASL videos is essential to building AI systems that healthcare providers can trust. The American Society for Deaf Children works with expert interpreters and fluent signers to examine each video in the Signs dataset, ensuring that models learn only from correct examples.
Cheri Dowling, executive director of the American Society for Deaf Children, stresses how important professionally vetted tools are for families, especially hearing parents of deaf children. High accuracy reduces the chance of confusion and builds trust in using the technology at hospitals and at home.
High-quality data also lays the groundwork for future AI tools that let deaf patients communicate more effectively with clinicians. As these models mature, they can be applied to hospital scheduling, telehealth, and day-to-day workflows, removing long-standing communication barriers.
Healthcare workers face real challenges when patients communicate in ASL. Communication breakdowns can lead to errors, delays, lower patient satisfaction, and worse health outcomes. AI tools that translate ASL accurately can address many of these problems.
Front-office tasks such as booking appointments and answering phones are where communication most often breaks down. If staff do not know ASL, even routine scheduling and simple questions become difficult. AI services that recognize ASL signs and convert them into text or speech can ease these tasks, in some cases operating live during phone or video calls.
Simbo AI, a company that automates phone answering for healthcare, is one example: its system can interpret sign language on calls or video chats, so an interpreter is not always needed on the spot. For hospital administrators and IT staff, that means better and more affordable communication options.
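How a given vendor voices recognized signing is not detailed here. As a minimal sketch, once an upstream recognizer has produced English text (ASL-to-English translation is itself a hard problem that this example glosses over), off-the-shelf text-to-speech can speak it to the hearing side of a call:

```python
# Minimal sketch: voice recognizer output for the hearing party on a call.
# Assumes English text already exists; not any vendor's actual pipeline.
# Requires: pip install pyttsx3
import pyttsx3

def speak_recognized_text(text: str) -> None:
    """Render recognized text as audio using the platform's default voice."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()  # blocks until playback finishes

speak_recognized_text("I would like to reschedule my appointment.")
```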
ASL relies not only on hand and finger movements but also on facial expressions and head movements, known as non-manual signals, which carry important grammatical meaning and emotional tone. For now, the Signs project focuses mainly on handshapes and movement, but it plans to add facial and head signals to improve AI accuracy.
That addition will help hospitals capture a patient’s full message: a small change in expression can alter meaning or mood, a distinction that matters in a clinical setting.
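Capturing non-manual signals means a model must see the hands and face together. The Signs team’s approach is not public; as one generic illustration, MediaPipe’s holistic model extracts hand, face, and pose landmarks from a single frame, so a downstream classifier can combine manual and non-manual features:

```python
# Sketch: extract manual (hands) and non-manual (face, head pose) features
# from one frame. A generic approach, not the Signs project's pipeline.
# Requires: pip install opencv-python mediapipe
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

def extract_features(image_bgr):
    """Return hand, face, and pose landmarks for a single BGR frame."""
    # For video, construct Holistic once and reuse it across frames.
    with mp_holistic.Holistic(static_image_mode=True) as holistic:
        results = holistic.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    return {
        # Manual channel: handshape and movement.
        "left_hand": results.left_hand_landmarks,
        "right_hand": results.right_hand_landmarks,
        # Non-manual channel: facial expression and head position.
        "face": results.face_landmarks,
        "pose": results.pose_landmarks,
    }
```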
The Rochester Institute of Technology’s Center for Accessibility and Inclusion Research is also working with the project, helping ensure the system is usable for deaf and hard-of-hearing people.
Medical offices increasingly use AI to improve operations and patient communication, from scheduling and reminders to customer service. Adding ASL recognition is the next step in that progression.
With the Signs dataset, models can be trained to recognize sign language and render it as text during video or phone calls, giving medical staff new ways to communicate with deaf patients.
AI tools can pair phone answering with ASL recognition: capturing questions and requests accurately, routing them to the right people quickly, keeping records, and turning signed requests into tasks in hospital software systems.
Simbo AI’s phone automation is an example of this in action. For clinics serving diverse patient populations, using AI to handle front-desk communication in ASL helps in two ways: it gives deaf patients direct, immediate access, and it reduces the staff burden of arranging interpreter coverage.
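Neither the article nor vendor documentation specifies how recognized requests become tasks in practice. Under that caveat, the routing step described above could be as simple as mapping a recognized intent to a department queue; every name below (intent labels, departments, the Task record) is hypothetical:

```python
# Hypothetical sketch: turn text produced by an ASL recognizer into a routed
# task record. Illustrative only; not any vendor's actual API.
from dataclasses import dataclass, field
from datetime import datetime, timezone

ROUTES = {
    "schedule_appointment": "front_desk",
    "prescription_refill": "pharmacy",
    "billing_question": "billing",
}

@dataclass
class Task:
    patient_id: str
    request_text: str  # text transcribed from the recognized signs
    department: str
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def route_request(patient_id: str, request_text: str, intent: str) -> Task:
    """Map a recognized intent to a department and log the request as a task."""
    department = ROUTES.get(intent, "front_desk")  # unknown -> human follow-up
    return Task(patient_id, request_text, department)

# Example: a signed request recognized as a refill question.
task = route_request("pt-001", "I need to refill my prescription",
                     "prescription_refill")
print(task.department)  # -> "pharmacy"
```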
Making the Signs dataset broadly available to developers will help create more tools for healthcare, such as real-time ASL translation during telehealth visits, digital guides with ASL avatars, and virtual interpreters in hospitals.
Such tools help not only deaf patients but also healthcare workers, reducing reliance on costly outside interpreters. They also support compliance with laws such as the Americans with Disabilities Act (ADA), which requires effective communication access.
Early ASL learning tools also benefit families, particularly hearing parents of deaf children. Establishing communication early supports better health and development and reduces the problems that miscommunication can cause later on.
The Signs platform invites signers of all skill levels to contribute videos, which broadens and strengthens the dataset.
Healthcare providers interested in ASL recognition can support these efforts or adopt the technology as it matures. Once the Signs dataset is fully released, developers will be able to build custom AI applications for specific needs.
Hospitals with forward-looking IT teams can pilot these systems to improve communication, accessibility, and legal compliance.
Ongoing collaboration among technology companies, advocacy groups such as the American Society for Deaf Children, and universities will help produce AI tools that truly meet the needs of deaf patients.
American Sign Language is a vital part of healthcare communication across the country. Building and validating large ASL video datasets enables AI that understands the language well. For medical office managers, owners, and IT leaders, these tools can improve patient experience, office operations, and regulatory compliance. Integrating AI-based ASL recognition into daily medical workflows is an important step toward equitable care and better communication with the diverse patient population of the United States.
ASL is the third most prevalent language in the U.S., yet far fewer AI tools exist for it than for dominant languages like English and Spanish, highlighting a critical need for accessible ASL technology.
Signs is an interactive web platform that supports ASL learning and accessible AI application development, featuring a 3D avatar to demonstrate signs and AI analysis of webcam footage to provide real-time signing feedback.
Users of any skill level can record themselves signing specific words to help build a validated video dataset, which NVIDIA aims to grow to 400,000 clips representing 1,000 signs.
Most deaf children are born to hearing parents; accessible tools like Signs enable family members to learn ASL early, opening effective communication channels even with very young children.
Fluent ASL users and interpreters validate the dataset to ensure each sign’s accuracy, resulting in a high-quality visual dictionary and reliable teaching tool.
The team plans to integrate facial expressions, head movements, regional sign variations, and slang, enhancing the platform’s ability to capture the full nuance of ASL communication.
These technologies could break down barriers between deaf and hearing communities by enabling real-time AI-powered support, digital human applications, and improved video conferencing tools with ASL support.
Volunteers can record and contribute their signing, expanding the dataset’s diversity and helping to refine the AI models supporting ASL recognition and feedback.
RIT researchers evaluate and improve the user experience of the Signs platform, ensuring it effectively serves deaf and hard-of-hearing users’ accessibility needs.
The dataset is planned for public release later in the year, enabling broader access for developers and researchers to build accessible ASL-related AI technologies.