Clinical Decision Support (CDS) software is any digital system that helps healthcare workers by gathering and analyzing medical data and then offering advice or alerts about diagnosis, treatment, or disease prevention. These tools range from simple to complex: some perform basic calculations, such as body mass index (BMI) or blood pressure averages, while others use advanced artificial intelligence to predict patient outcomes or suggest treatments tailored to an individual patient.
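As a simple illustration of the calculator end of that spectrum, the sketch below computes BMI from weight and height; the function name and example values are illustrative, not drawn from any particular CDS product.

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """Compute BMI as weight (kg) divided by height (m) squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# Example: 82 kg at 1.75 m gives a BMI of about 26.8.
print(round(body_mass_index(82, 1.75), 1))
```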
CDS tools often connect to Electronic Health Records (EHRs) or other hospital systems to provide real-time support during patient care. Examples include drug-interaction alerts, dosing calculators, sepsis warnings, and AI models that predict risks such as stroke or addiction.
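As a hedged sketch of how a rule-based interaction alert might work, the snippet below checks a new prescription against a patient's current medication list. The drug pairs and function names are illustrative assumptions, not a clinical knowledge base.

```python
# Illustrative interaction pairs only; a real system would query a curated,
# regularly updated drug-interaction knowledge base.
KNOWN_INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"simvastatin", "clarithromycin"}): "increased risk of myopathy",
}

def interaction_alerts(current_meds: list[str], new_drug: str) -> list[str]:
    """Return alert messages for known interactions with the new prescription."""
    alerts = []
    for med in current_meds:
        pair = frozenset({med.lower(), new_drug.lower()})
        if pair in KNOWN_INTERACTIONS:
            alerts.append(f"{new_drug} + {med}: {KNOWN_INTERACTIONS[pair]}")
    return alerts

print(interaction_alerts(["Warfarin", "Metformin"], "Aspirin"))
# ['Aspirin + Warfarin: increased bleeding risk']
```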
The FDA oversees clinical decision support tools as part of its framework for Software as a Medical Device (SaMD). SaMD is software intended for a medical purpose that performs that purpose on its own, without being part of a hardware medical device. The 21st Century Cures Act, enacted in 2016, clarified how the FDA categorizes and regulates software, separating software that requires FDA regulation from software that does not.
Whether a CDS tool is regulated depends mainly on its "intended use": how the software is described, marketed, and used. If the software is intended to diagnose, treat, mitigate, or prevent disease, it usually falls under FDA rules.
Not all CDS programs are medical devices that the FDA must regulate. Section 3060 of the 21st Century Cures Act exempts CDS tools that meet four criteria: the software does not acquire or analyze medical images or signals; it displays or analyzes medical information; it supports or provides recommendations for diagnosis or treatment to a healthcare professional; and it allows that professional to independently review the basis for its recommendations.
Tools that meet all four criteria are considered aids to clinical judgment rather than decision-makers, so they do not need FDA approval.
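A minimal self-assessment sketch, assuming the four criteria are reduced to yes/no answers; this illustrates the screening logic only and is not a legal determination or an FDA tool.

```python
from dataclasses import dataclass

@dataclass
class CdsProfile:
    """Yes/no answers to the four Cures Act criteria (illustrative only)."""
    analyzes_images_or_signals: bool    # criterion 1 fails if True
    displays_medical_information: bool  # criterion 2
    supports_recommendations: bool      # criterion 3
    allows_independent_review: bool     # criterion 4

def likely_non_device_cds(profile: CdsProfile) -> bool:
    """Return True only if all four exemption criteria appear to be met."""
    return (
        not profile.analyzes_images_or_signals
        and profile.displays_medical_information
        and profile.supports_recommendations
        and profile.allows_independent_review
    )

# Example: a tool that surfaces guideline text and explains its reasoning.
print(likely_non_device_cds(CdsProfile(False, True, True, True)))  # True
```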
In September 2022, the FDA released final guidance on Clinical Decision Support software. The final guidance removed much of the enforcement discretion proposed in earlier drafts, the circumstances in which the agency would choose not to enforce its regulations, and signals that the FDA now intends to oversee more CDS tools.
One concern is automation bias, which occurs when clinicians rely too heavily on software instead of their own judgment. The worry grows when the software presents a single treatment choice without explaining its reasoning.
If a CDS tool follows established clinical guidelines and FDA-approved labeling, the FDA is less likely to intervene. Software that departs from accepted practice, or that issues directive treatment instructions without room for independent review, will likely require FDA clearance and oversight.
The FDA and international groups such as the International Medical Device Regulators Forum (IMDRF) classify SaMD based on risk. The risk level depends on two factors: the significance of the information the software contributes to the healthcare decision (informing management, driving management, or treating and diagnosing) and the seriousness of the patient's condition (non-serious, serious, or critical).
Most CDS tools fall into low-risk categories (Class I or II): they offer information while leaving judgment to the healthcare professional. High-risk tools that directly drive critical treatment or diagnostic decisions must follow the strictest regulations (Class III).
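As an illustrative sketch of how the two IMDRF factors combine, the mapping below follows the published IMDRF SaMD risk categories (I lowest, IV highest); it is a simplified reading of that framework, not FDA classification logic.

```python
# Simplified reading of the IMDRF SaMD risk categories.
# Keys: (state of the healthcare condition, significance of the information).
RISK_CATEGORY = {
    ("critical",    "treat_or_diagnose"): "IV",
    ("critical",    "drive_management"):  "III",
    ("critical",    "inform_management"): "II",
    ("serious",     "treat_or_diagnose"): "III",
    ("serious",     "drive_management"):  "II",
    ("serious",     "inform_management"): "I",
    ("non-serious", "treat_or_diagnose"): "II",
    ("non-serious", "drive_management"):  "I",
    ("non-serious", "inform_management"): "I",
}

# Example: a sepsis-alerting tool that informs management of a critical condition.
print(RISK_CATEGORY[("critical", "inform_management")])  # II
```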
Medical practice managers and IT leaders face a number of challenges under these rules. The FDA's scope covers CDS software made by commercial vendors and may also reach tools built by hospitals themselves, where the rules are not yet clear.
Hospitals often create and adapt their own CDS tools to fit their patient populations, for example to catch conditions such as sepsis or opioid use disorder early. Dr. Gary E. Weissman notes that FDA oversight could improve safety but might slow innovation and add administrative burden, especially for hospitals with fewer resources.
Right now, many hospitals govern their own AI-based CDS without formal standards or external expert review. Partnering with AI vendors may help with FDA compliance but can raise concerns about data privacy and conflicts of interest.
Artificial intelligence is now part of everyday healthcare work. Even before complete FDA rules for AI-based CDS are in place, many clinics use AI to improve their operations, especially for front-office and administrative tasks.
For example, AI phone systems handle patient calls, appointment scheduling, reminders, and basic triage, reducing errors and saving staff time. For healthcare managers, automated phone support means shorter waits and better use of resources.
Combining AI-based CDS with workflow tools can make patient care smoother and faster, but doing so requires careful attention to data-security and patient-privacy rules.
In clinics, AI can provide immediate alerts about patient risks or treatment suggestions based on well-defined models. Managers must make sure these tools meet FDA requirements to avoid compliance problems; choosing AI tools with proper FDA authorization and postmarket monitoring helps keep care safe.
The FDA is paying closer attention to AI and machine learning in healthcare. By June 2024, around 950 AI/ML-enabled medical devices had received FDA authorization. Most (about 75%) are in radiology, and cardiology accounts for roughly 11%. In 2023 alone, more than 178 devices were authorized.
However, many AI tools used now do not have FDA clearance. Professor Nicholson Price from the University of Michigan says most AI tools are used without formal FDA review, relying on exemptions from the 21st Century Cures Act.
Getting FDA approval typically involves several steps, from determining the device classification and assembling clinical evidence to submitting a premarket application and maintaining postmarket surveillance.
Developers should avoid open-source or in-house data-labeling tools that lack security controls or audit trails, as weak data provenance can hurt approval chances.
Large language models (LLMs), such as those behind AI chatbots, present a new challenge. They often give advice that resembles clinical decision support, especially in emergencies, even though they usually state that they are not intended for clinical use.
Dr. Gary Weissman's research shows that LLMs offer advice that could influence medical decisions made by clinicians or others, which means they may qualify as medical devices under FDA rules. Current FDA frameworks, however, do not fit these models well.
The FDA is working on new approaches to regulating these tools, such as total product lifecycle oversight and predetermined change control plans that let developers update algorithms within pre-authorized limits.
Medical practice owners and administrators should know the regulatory background when using CDS and AI tools.
For healthcare organizations using Clinical Decision Support software and AI, knowing the FDA’s rules is very important. It helps make sure technology works safely, follows the law, and fits well into clinical work. Hospital leaders, clinic owners, and IT managers in the U.S. should keep up with FDA updates to make the best choices about using CDS tools in their work.
SaMD is software intended for one or more medical purposes that performs those purposes on its own, without being part of a hardware medical device. The 21st Century Cures Act updated how it is classified and regulated.
No. CDS tools that meet all four criteria specified by the 21st Century Cures Act are considered non-device CDS and do not require FDA approval; tools that fail any of the criteria are regulated as medical devices.
1. It must not acquire or analyze medical images or signals. 2. It must display or analyze medical information. 3. It must support or provide recommendations for diagnosis or treatment to a healthcare professional. 4. It must allow that professional to independently review the basis for its recommendations.
The FDA's involvement in clinical validation of SaMD has been limited, and many approved algorithms are rarely tested in real-world settings.
Their analysis indicated that AI models are often not tested outside their training environments, leading to poorer performance when validated on external data.
CHAI’s AI Action Plan aims to create standardized performance benchmarking and address regulatory gaps for varying risk levels of AI applications in healthcare.
CHAI suggests that high-risk applications, such as diagnostic tools, should undergo stronger oversight, while lower-risk applications should have fewer regulatory requirements.
CHAI’s principles include usefulness, fairness, safety, transparency, and privacy, which guide the development and evaluation of AI in healthcare.
The Applied Model Card provides detailed information about healthcare algorithms, including developer identity, bias mitigation, training data sources, and model limitations.
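A minimal sketch of what such a model card might capture in code, assuming a simple data structure; the fields mirror the categories listed above, and the example values are invented for illustration rather than taken from a real product.

```python
from dataclasses import dataclass, field

@dataclass
class AppliedModelCard:
    """Illustrative structure mirroring the categories a model card covers."""
    model_name: str
    developer: str                    # who built and maintains the algorithm
    training_data_sources: list[str]  # where the training data came from
    bias_mitigation: str              # steps taken to reduce bias
    limitations: list[str] = field(default_factory=list)

# Example values are hypothetical.
card = AppliedModelCard(
    model_name="Example sepsis-risk model",
    developer="Hypothetical Health AI Vendor",
    training_data_sources=["De-identified EHR data from two academic hospitals"],
    bias_mitigation="Performance audited across age, sex, and race subgroups",
    limitations=["Not validated for pediatric patients"],
)
print(card.model_name, "-", card.developer)
```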
Continuous monitoring ensures that AI applications remain effective and safe in clinical settings, helping mitigate risks associated with their use in patient care.