Telehealth services expanded rapidly during and after the COVID-19 pandemic, creating new opportunities but also new challenges for healthcare. Patients with limited English proficiency (LEP), meaning those who do not speak English well enough to communicate in a healthcare setting, face language barriers that make care hard to access. Deaf patients also have distinct communication needs that telehealth must meet to deliver safe, effective care.
About 25 million people in the U.S. speak English less than "very well." These patients often struggle both to access care and to explain their symptoms and concerns. Poor communication can lead to misdiagnoses, delayed treatment, medication errors, and lower satisfaction with care. One study found that about 38% of AI transcription errors could cause harmful misunderstandings, underscoring how important reliable language access is in healthcare.
LEP patients tend to follow their treatment plans less consistently and report lower satisfaction when professional interpreters are unavailable. Nearly 8% of Medicare patients have limited English proficiency and usually need interpreters or translated materials to understand their care. Without language assistance, they may avoid care altogether or leave the hospital early.
Financial and technology barriers compound the problem. Many low-income LEP patients struggle with digital tools, lack reliable internet access, or have no private space for telehealth visits. For example, only 71% of households earning under $30,000 own smartphones, compared with 95% of households earning over $75,000. These gaps make it harder to deliver language services through telehealth.
Deaf patients face a different set of challenges. Sign languages are visual and three-dimensional, while most telehealth video is two-dimensional. Many Deaf patients who use a sign language other than American Sign Language (ASL) need two interpreters working together: a Certified Deaf Interpreter (CDI) and an ASL interpreter who relays to the clinician.
Healthcare staff often fall back on written notes, lip-reading, or gestures, but these methods are not reliable enough for clinical communication. A study in the UAE found that more than 70% of Deaf patients had trouble understanding medical procedures, leading to poorer care and delays.
Finding qualified sign language interpreters for telehealth is difficult, and scheduling conflicts, technical problems, and limited staff training make it worse. Hospitals and clinics need clearer policies and better training so staff can communicate effectively with Deaf patients. With hearing loss projected to rise worldwide, these issues will only grow in importance.
Providing language services to LEP and Deaf patients is a legal requirement. Federal laws such as Title VI of the Civil Rights Act prohibit discrimination based on national origin, and healthcare providers that receive federal funding must take reasonable steps to provide language access. Section 1557 of the Affordable Care Act extends these protections, requiring meaningful communication, including human oversight of automated translation, in all healthcare settings, including telehealth.
Disability laws such as Section 504 of the Rehabilitation Act and the Americans with Disabilities Act require effective communication for people with disabilities, including Deaf patients, in telehealth. Providers must use qualified interpreters trained in both medical terminology and telehealth platforms. Family members or friends should not serve as interpreters, since that practice risks errors, privacy breaches, and bias.
The Mid-Atlantic Telehealth Resource Center (MATRC) recommends video interpretation services, such as Video Remote Interpretation (VRI), over audio-only methods. Video lets patients and providers see important nonverbal cues, which improves comprehension and makes communication clearer during telehealth visits.
Medical leaders and IT staff can meet these obligations by developing language access plans (LAPs). A typical LAP covers an assessment of the languages patients need, how qualified interpreters and translated materials will be provided, staff training, and ongoing monitoring of the plan's effectiveness.
Healthcare providers working in telehealth should adapt how they communicate to serve LEP and Deaf patients well. Good practices include identifying language needs when visits are scheduled, arranging qualified interpreters in advance, preferring video over audio-only interpretation, and allowing extra time for interpreted visits.
Telehealth is also evolving with artificial intelligence (AI) and automation tools that improve efficiency and the patient experience. One application is AI-powered phone systems that handle appointments, answer routine questions, and provide assistance in many languages, cutting wait times and freeing staff for harder tasks. For LEP patients, AI can offer quick supplementary language support through speech recognition and translation.
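As a rough illustration of how such a phone system might separate routine tasks from ones that need a person, the sketch below routes a call based on the caller's captured language preference and stated intent. All names here (`Caller`, `route_call`, the language and intent sets) are hypothetical, not a real telehealth API.

```python
# Hypothetical intake-routing step for an AI phone system: send
# high-stakes requests, or callers whose language the automated flow
# does not support, to a human interpreter queue.
from dataclasses import dataclass

SUPPORTED_SELF_SERVICE = {"en", "es", "zh", "vi"}   # assumed AI coverage
HIGH_RISK_INTENTS = {"new_symptoms", "medication_question", "consent"}

@dataclass
class Caller:
    language: str   # ISO 639-1 code captured at intake, e.g. "es"
    intent: str     # e.g. "reschedule", "new_symptoms"

def route_call(caller: Caller) -> str:
    """Route high-risk or unsupported-language calls to a human."""
    if caller.intent in HIGH_RISK_INTENTS:
        return "human_interpreter_queue"
    if caller.language not in SUPPORTED_SELF_SERVICE:
        return "human_interpreter_queue"
    return "automated_self_service"

print(route_call(Caller("es", "reschedule")))    # automated_self_service
print(route_call(Caller("es", "new_symptoms")))  # human_interpreter_queue
```

The key design choice, consistent with the cautions below, is that automation handles only low-risk tasks; anything clinical defaults to a person.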
Still, AI must be used carefully. Studies show AI transcriptions can contain errors that cause serious misunderstandings, so AI should not replace human interpreters in high-stakes conversations such as diagnoses or informed consent.
The "human in the loop" model is gaining traction: AI handles routine language tasks, while humans step in quickly for complex or high-stakes conversations. Seattle Children's Hospital, for example, uses AI to translate clinical documents, with human interpreters reviewing the output to keep it safe and accurate. Other programs show that AI and human interpreters can work well together.
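A minimal sketch of that review pattern, under assumed thresholds: machine translation produces a draft, but any document that is high-risk, or whose translation confidence falls below a cutoff, is held for a qualified human translator. `machine_translate` is a stand-in function, not a real service.

```python
# Illustrative "human in the loop" translation workflow. The MT call is
# a stand-in that returns a draft plus a confidence score in [0, 1].
def machine_translate(text: str, target_lang: str) -> tuple[str, float]:
    """Stand-in for a machine-translation service."""
    return f"[{target_lang}] {text}", 0.80   # fixed score for the demo

def translate_with_review(text: str, target_lang: str,
                          high_risk: bool,
                          min_confidence: float = 0.95) -> dict:
    draft, confidence = machine_translate(text, target_lang)
    # High-risk content (consent forms, discharge instructions) or
    # low-confidence output is never released without human review.
    needs_human = high_risk or confidence < min_confidence
    return {
        "draft": draft,
        "status": "pending_human_review" if needs_human else "released",
    }

result = translate_with_review("Take one tablet daily.", "es", high_risk=True)
print(result["status"])   # pending_human_review
```

In practice the confidence score and risk labels would come from the translation vendor and the document type; the point is that release is gated on human sign-off, not on the AI's own output.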
Automation also reduces staff workload, particularly around scheduling interpreters and managing language logistics. AI can handle callbacks, route patients by language, and operate around the clock, which improves patient satisfaction.
AI tools can be built into telehealth systems to streamline workflow. Automated calls, for example, can detect a patient's language needs and connect them to the right services or interpreters, while real-time data can monitor language-service quality and drive improvement over time.
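The quality-monitoring idea can be sketched simply: aggregate interpreter connection wait times per language and flag any language whose average exceeds a service-level target. The target value and the event log format below are illustrative assumptions.

```python
# Hedged sketch of real-time language-service monitoring: flag languages
# whose average interpreter wait time exceeds an assumed target.
from collections import defaultdict

TARGET_WAIT_SECONDS = 120   # assumed service-level target

def flag_slow_languages(wait_log) -> list[str]:
    """wait_log: iterable of (language, wait_seconds) events."""
    totals = defaultdict(lambda: [0, 0])    # language -> [sum, count]
    for lang, secs in wait_log:
        totals[lang][0] += secs
        totals[lang][1] += 1
    return sorted(lang for lang, (total, count) in totals.items()
                  if total / count > TARGET_WAIT_SECONDS)

log = [("asl", 300), ("asl", 240), ("es", 45), ("es", 60)]
print(flag_slow_languages(log))   # ['asl']
```

A dashboard built on a metric like this would tell administrators which languages need more interpreter capacity before patients start abandoning visits.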
Clinics, hospitals, and private practices in the U.S. must comply with federal language access laws or face lawsuits, fines, and loss of funding. Language barriers also raise healthcare costs through hospital readmissions, medication errors, and preventable complications.
Healthcare leaders should prioritize a compliance plan covering a written language access policy, qualified interpreter services, staff training, and regular monitoring of language-service quality.
Demand for telehealth among LEP and Deaf patients will grow as the U.S. becomes more linguistically and culturally diverse. Continued research, and the use of AI tools alongside skilled human interpreters, will help providers keep care safe and effective.
Healthcare leaders, medical practice owners, and IT managers need to stay current on technology, regulations, and best practices in language access. Planned investment in language access programs, staff training, and technology will be essential to fair, compliant, high-quality telehealth for all patients.
Medical practice administrators should treat language access as a core component of telehealth services. Providing qualified human interpreters, deploying AI carefully, and complying with the law will improve patient outcomes, build trust, and keep healthcare equitable for everyone.
Telehealth’s rapid expansion has spotlighted the critical challenge of delivering language access at scale, especially for limited-English-proficient (LEP) and Deaf/Hard-of-Hearing patients, leading to risks such as misdiagnoses and reduced patient satisfaction.
AI can automate translation, reduce wait times, and scale multilingual communication, thereby assisting telehealth providers in enhancing patient engagement and communication.
Human oversight is necessary to ensure accuracy in high-stakes medical communication, as misinterpretations can result in harmful misunderstandings, particularly in critical interactions such as diagnoses or informed consent.
Current regulations, particularly Section 1557, mandate human oversight in critical medical communications to prevent non-compliance and patient harm, emphasizing that AI alone cannot replace trained interpreters.
AI lacks the cultural sensitivity that is vital in specialties like mental health; cultural nuances can be lost or misread, making trained human interpreters essential for effective communication.
The ‘human in the loop’ model suggests combining AI assistance for low-risk tasks with immediate access to human interpreters for high-complexity interactions, ensuring quality and safety in communication.
Examples include Seattle Children’s Hospital’s AI pilot for translating clinical documents, where AI-generated content is reviewed by qualified human translators to improve turnaround times while ensuring safety.
Organizations should start by conducting a needs assessment, choosing HIPAA-compliant AI tools, building staff awareness through training, and monitoring quality and compliance for effective language access.
AI struggles with accuracy, particularly in high-stakes situations, and lacks the cultural sensitivity necessary for nuanced medical conversations, making it inadequate as a sole solution.
Future innovations in telehealth language services may be driven by advancements in AI voice recognition and context modeling, enhancing the capabilities of AI while still requiring a qualified human interpreter for complex cases.