AI technologies have rapidly expanded in clinical practice across North America. They support various applications including medical image analysis, clinical documentation, patient monitoring, administrative tasks, and personalized treatment planning. Investment in AI reflects this growth; the healthcare AI market is expected to increase from $11 billion in 2021 to $187 billion by 2030. This expansion is driven by AI’s ability to quickly analyze large datasets, improve diagnostic accuracy, predict risks, and reduce the workload on clinicians.
An example of AI in clinical use is the ambient AI platform introduced by Duke Health. This system uses natural language processing and ambient clinical intelligence to make documentation easier during patient visits. According to Dr. Eric Poon, Chief Health Information Officer at Duke Health, clinicians using this tool experience less mental effort in documenting encounters, which allows them to concentrate on meaningful patient interactions. The platform is being implemented for 5,000 clinicians across more than 150 clinics in North Carolina, with early reports showing improved clinical efficiency and provider satisfaction.
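The general pattern behind ambient documentation tools — capture the visit conversation, then structure it into a draft clinical note — can be illustrated in simplified form. The sketch below is not Abridge's or Duke Health's actual pipeline (real platforms use speech recognition and large language models); the keyword rules and SOAP section names are assumptions chosen for demonstration.

```python
# Illustrative sketch of ambient documentation: sort transcript
# sentences into a SOAP-style draft note using simple keyword rules.
# Production systems use speech recognition plus language models;
# the keywords and sections here are assumptions for demonstration.

SECTION_KEYWORDS = {
    "Subjective": ["feel", "pain", "complains", "reports"],
    "Objective": ["blood pressure", "temperature", "exam", "bpm"],
    "Assessment": ["diagnosis", "likely", "consistent with"],
    "Plan": ["prescribe", "follow up", "refer", "schedule"],
}

def draft_note(transcript: list[str]) -> dict[str, list[str]]:
    """Bucket transcript sentences into draft SOAP sections."""
    note = {section: [] for section in SECTION_KEYWORDS}
    for sentence in transcript:
        lowered = sentence.lower()
        for section, keywords in SECTION_KEYWORDS.items():
            if any(k in lowered for k in keywords):
                note[section].append(sentence)
                break
    return note

visit = [
    "Patient reports chest pain for two days.",
    "Blood pressure is 140 over 90.",
    "Symptoms are consistent with angina.",
    "Schedule a stress test and follow up in one week.",
]
print(draft_note(visit))
```

In practice the clinician reviews and edits the draft before signing, which is why the article emphasizes keeping human oversight central.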
While AI’s role in assisting clinical functions is clear, integrating it into medical training remains a challenge. Many programs have been slow to adapt, creating a gap between clinician skills and the demands of an AI-enabled clinical environment.
Healthcare administrators and educators agree that future clinicians must be prepared not only to use AI but to understand its capabilities, limitations, and ethical considerations. This type of training is important to ensure AI is used safely and effectively without compromising patient care.
An article published in eClinicalMedicine recommends formal AI education within medical curricula. Authors Tim Schubert and colleagues propose a three-level framework of AI expertise that aligns with stages of clinical training. It starts with basic AI literacy for beginners, moves to intermediate knowledge of AI applications, and ends with expert understanding for clinicians handling complex decisions or AI management. This structure creates a progressive learning path linking daily clinical needs to the sophistication of AI tools.
Medical educators stress that AI education should be interactive and embedded in case-based collaborative learning. Dr. David H. Roberts, Dean of External Education at Harvard Medical School, highlights the need to move away from traditional lectures toward interdisciplinary teaching where students learn to use AI alongside human judgment. The Harvard Macy Institute incorporates generative AI in its programs to help with content summarization, question generation, and tutoring while keeping clinician mentorship central.
Nursing education is also changing. Duke University School of Nursing, led by Associate Professor Dr. Michael Cary, has started initiatives to increase AI literacy among nurses. Dr. Cary points out that current training does not fully prepare nurses to work effectively with AI tools. His programs include workshops on governance, ethical AI use, and integrating AI into clinical decision-making. Including nurses in AI education recognizes their key role as the largest healthcare workforce and supports broad adoption throughout care teams.
Shriya Das, MS, MSc, notes that current healthcare training often lacks comprehensive coverage of these complex topics, making it important to develop curricula that include both technical and ethical AI education.
AI integration also affects workflow automation in healthcare. Beyond clinical decision-making, AI is improving front-office and back-office tasks, leading to better administrative efficiency and faster patient access.
Systems like Simbo AI offer AI-powered front-office phone automation that manages patient calls, appointment scheduling, and basic questions. For medical practice owners and administrators, adopting these tools can reduce burdens on reception staff, shorten wait times, and improve patient satisfaction by ensuring timely responses.
Other automation uses include revenue cycle management, medical coding, chart reviews, and drafting communications in patient portals.
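A front-office phone automation system of the kind described above typically classifies an incoming request by intent before handing it off to scheduling logic or a human. The sketch below is an illustrative assumption, not Simbo AI's implementation; production systems use trained speech and natural-language-understanding models rather than keyword matching, and the intents and keywords here are invented for demonstration.

```python
# Illustrative sketch of intent routing for front-office call
# automation. The intents and keywords are assumptions for
# demonstration; real systems use trained speech/NLU models.

INTENT_KEYWORDS = {
    "schedule_appointment": ["appointment", "book", "reschedule"],
    "prescription_refill": ["refill", "prescription", "pharmacy"],
    "billing_question": ["bill", "invoice", "insurance"],
}

def route_call(utterance: str) -> str:
    """Map a caller's request to an intent, else escalate to staff."""
    lowered = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in lowered for k in keywords):
            return intent
    return "transfer_to_staff"

print(route_call("I'd like to book an appointment for next Tuesday"))
print(route_call("I have a question about my lab results"))
```

The fall-through to `transfer_to_staff` reflects the design point in the surrounding text: automation handles routine requests, while ambiguous ones still reach reception staff.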
Integrating AI-powered workflow automation requires IT managers and administrators to work closely with clinical leaders to select, implement, and evaluate systems. This also means ensuring compatibility with existing electronic health records and training staff on new workflows.
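Compatibility with existing electronic health records is commonly achieved through standards-based interfaces such as HL7 FHIR. As a hedged illustration of what "compatibility" means in practice, a scheduling automation might hand a booked slot to the EHR as a FHIR R4 Appointment resource. The IDs and values below are placeholders, not any specific vendor's API.

```python
# Hedged sketch: representing a booked slot as an HL7 FHIR R4
# Appointment resource, the kind of payload a workflow-automation
# tool might submit to an EHR's FHIR endpoint. IDs are placeholders.
import json

def build_fhir_appointment(patient_id: str, practitioner_id: str,
                           start: str, end: str) -> dict:
    """Assemble a minimal FHIR R4 Appointment resource."""
    return {
        "resourceType": "Appointment",
        "status": "booked",
        "start": start,
        "end": end,
        "participant": [
            {"actor": {"reference": f"Patient/{patient_id}"},
             "status": "accepted"},
            {"actor": {"reference": f"Practitioner/{practitioner_id}"},
             "status": "accepted"},
        ],
    }

appt = build_fhir_appointment(
    "12345", "67890",
    "2024-07-01T09:00:00Z", "2024-07-01T09:30:00Z",
)
print(json.dumps(appt, indent=2))
```

Evaluating whether a candidate AI system can produce and consume standard resources like this is one concrete task for the IT managers and clinical leaders the article describes.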
For medical practice administrators and IT professionals, AI adoption involves more than clinician training. It demands careful planning when choosing AI solutions, setting governance frameworks, and addressing security and privacy concerns.
Duke Health’s ABCDS (algorithm-based clinical decision support) governance framework illustrates best practices to make sure AI tools are safe, fair, and effective before wide clinical use. This oversight includes ongoing evaluation of the clinical impact and effects on clinician burnout and job satisfaction.
These governance principles are important for other health systems and private practices aiming to implement AI responsibly. When administrators understand the technical and educational sides of AI, they can make better purchasing decisions, design training programs, and encourage acceptance across their organizations.
Healthcare professionals trained now will work in environments where AI tools are standard. Institutions like Duke Health plan to expose trainees to AI early so they gain familiarity and confidence. This approach aims to build a workforce that uses AI to improve patient care while maintaining human oversight and empathy.
Medical educators and healthcare leaders must work together to keep updating curricula and training to reflect advances in healthcare technology. This includes supporting interdisciplinary learning, ethical concerns, and adaptability as AI tools evolve.
The future of clinical education and healthcare delivery in the U.S. depends on how well institutions balance new technology with traditional clinical skills, making sure AI supports care rather than replaces clinical expertise.
As AI adoption grows in healthcare, medical practice administrators, owners, and IT managers in the U.S. must actively support clinician training and technology use. By understanding AI education and workflow automation needs, healthcare organizations can prepare their workforce, improve clinical operations, and enhance patient care quality in a complex environment.
The ambient documentation platform at Duke Health is provided by Abridge, which has signed deals with several prominent healthcare systems, including Mayo Clinic, Duke Health, and Johns Hopkins, to roll out generative AI-based clinical documentation tools. Within Duke Health, the platform will be available to 5,000 clinicians at more than 150 primary and specialty clinics. Adoption is optional; leadership wants clinicians to feel comfortable with the new technology before relying on it. Clinicians who use it report feeling more present during patient interactions and completing visits more efficiently without the distraction of documenting notes.
Duke Health is also interested in co-developing additional clinical applications using ambient AI, potentially extending the technology to other clinical settings, and intends to expose clinicians in training to AI tools before they practice independently. Its ABCDS governance process continues to oversee algorithm-based clinical decision support to ensure safe, effective, and equitable use, while the system assesses impacts on clinician burnout, satisfaction with documentation practices, and overall productivity. Beyond documentation, AI could enhance administrative tasks such as revenue cycle management, coding, chart reviews, and drafting communications in patient portals. Throughout, clinician feedback remains crucial to the evaluation process and informs decisions about continuing to deploy AI technologies based on their real-world impact.