AI systems need large volumes of well-labeled data to learn and improve. In radiology, that means reviewing many imaging reports and marking specific findings, patterns, or follow-up recommendations so the AI can later recognize them on its own.
For example, Northwestern Medicine built an AI system inside its electronic health record (EHR) to detect unexpected findings in lung and adrenal imaging. In one year, the system screened more than 460,000 radiology studies and flagged about 23,000 of them, roughly 5%, as containing lung findings with follow-up recommendations. Each flagged report had to be labeled accurately to train the natural language processing (NLP) algorithm so it could recognize similar findings in future reports.
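The article does not describe Northwestern Medicine's labeling schema or model architecture. As a minimal sketch, assuming simple report-level labels and a TF-IDF baseline standing in for the production NLP algorithm, the training data and classifier might look like this:

```python
# Minimal sketch of report-level labels for a follow-up classifier.
# The example reports, labels, and scikit-learn baseline are illustrative
# assumptions, not Northwestern Medicine's actual data or model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labeled_reports = [
    # (report impression text, 1 = follow-up recommended, 0 = no follow-up)
    ("6 mm right upper lobe nodule; recommend CT chest in 6-12 months.", 1),
    ("Incidental 1.2 cm adrenal lesion; dedicated adrenal protocol CT advised.", 1),
    ("No acute cardiopulmonary abnormality. No follow-up needed.", 0),
    ("Stable postsurgical changes; no suspicious nodules identified.", 0),
]

texts = [text for text, _ in labeled_reports]
labels = [label for _, label in labeled_reports]

# TF-IDF features plus logistic regression form a simple, transparent baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["New 8 mm pulmonary nodule; follow-up CT recommended."]))
```

A production system would of course need a far larger annotated corpus and clinical validation; the point here is only the shape of the labeled data.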
Labeling data at this scale takes substantial time and money when it is outsourced to external vendors, which are often expensive and deliver annotations of inconsistent quality. That inconsistency is a serious problem in clinical AI, where accuracy directly affects patient safety.
Instead, Northwestern Medicine turned to its own clinical staff: nurses and other frontline workers on temporary light-duty assignments who could not perform their usual clinical roles. These staff members were trained as annotators to carefully review and label radiology reports. The approach kept labeling in expert hands, lowered costs, maintained data quality, and put available personnel to productive use.
Healthcare organizations that want to use clinical staff for radiology data labeling should weigh several considerations:
Northwestern Medicine’s AI program offers a real-world example relevant to many US healthcare organizations. Faced with a large volume of radiology data to label, the health system assigned its own light-duty clinical staff, including nurses and other frontline team members, to label radiology reports.
The annotators reviewed reports describing unexpected lung and adrenal findings and highlighted the key words and concepts within them, producing the labeled examples used to train the AI’s natural language processing model.
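The annotation tooling and schema are not described in the article. As a rough sketch, assuming span-level labels with hypothetical field names, a single annotated report might be recorded like this:

```python
# Hypothetical span-level annotation for one radiology report.
# Character offsets index into the report text; labels and field names
# are illustrative, not an actual annotation schema.
annotation = {
    "report_id": "example-001",
    "text": "Incidental 7 mm left lower lobe nodule. Recommend follow-up CT in 6 months.",
    "spans": [
        {"start": 0, "end": 38, "label": "LUNG_FINDING"},
        {"start": 40, "end": 75, "label": "FOLLOW_UP_RECOMMENDATION"},
    ],
    "annotator": "light-duty RN",
}

# Sanity check: print each labeled span to confirm the offsets line up.
for span in annotation["spans"]:
    print(span["label"], "->", annotation["text"][span["start"]:span["end"]])
```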
This in-house method allowed Northwestern Medicine to:
The program also showed that AI supports, rather than replaces, physicians and radiologists: it streamlines workflows and prompts clinicians to act quickly, which lowers the chance of missed follow-ups.
Using AI in radiology takes more than good data labeling; AI alerts and tools also need to be embedded directly into clinical workflows so they are easy to use and genuinely effective.
Northwestern Medicine’s model shows how AI can be embedded in the EHR to fire alerts known as Best Practice Advisories (BPAs). These alerts notify physicians in real time when follow-up imaging is needed for an unexpected finding, so they can act immediately without leaving the EHR or losing time switching between systems.
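The article does not detail how the NLP output is handed to the EHR alerting layer. The following is a rough sketch, assuming a hypothetical NlpResult record and alert payload; real BPA integrations use vendor-specific interfaces not described in the source:

```python
# Rough sketch of turning a positive NLP result into a BPA-style alert payload.
# The NlpResult class, field names, and payload structure are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NlpResult:
    report_id: str
    finding: str                # e.g. "pulmonary nodule"
    follow_up_recommended: bool
    recommendation_text: str    # e.g. "CT chest in 6-12 months"

def build_bpa_payload(result: NlpResult, ordering_physician_id: str) -> Optional[dict]:
    """Return an alert payload for the ordering physician, or None if no follow-up is needed."""
    if not result.follow_up_recommended:
        return None
    return {
        "recipient": ordering_physician_id,
        "report_id": result.report_id,
        "message": f"Unexpected {result.finding} flagged: {result.recommendation_text}",
        "suggested_order": "follow-up imaging study",
    }

alert = build_bpa_payload(
    NlpResult("rpt-123", "pulmonary nodule", True, "CT chest in 6-12 months"),
    ordering_physician_id="dr-456",
)
print(alert)
```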
AI-driven automation offers these benefits to radiology departments and clinics:
Bringing AI into healthcare also raises ethical and regulatory questions that must be addressed for AI to work well and safely.
Healthcare leaders in the US must consider:
US healthcare organizations that want to start using AI in radiology or other areas can learn from current examples:
Preparing large volumes of radiology data for AI requires healthcare organizations to rethink how they label that data. Northwestern Medicine trained its own clinical staff to label radiology reports, an approach that lowers costs and improves data quality. Embedding AI alerts inside the electronic health record further supports workflows by letting physicians act quickly on unexpected findings that might otherwise be missed.
By addressing operational, ethical, and regulatory needs together, US medical centers and hospitals can bring AI safely into radiology, improving diagnosis, patient safety, and outcomes while using resources wisely. Medical leaders and IT managers should prioritize these practices to make AI work well in their organizations.
The article addresses the problem of delayed and missed follow-up on incidental diagnostic imaging findings, which can lead to patient harm and increased healthcare costs.
The AI system uses natural language processing (NLP) integrated with the electronic health record (EHR) to automatically identify radiology reports with incidental findings requiring follow-up and to trigger alerts within the physician’s workflow.
The AI facilitates physician decision-making by identifying reports and triggering alerts but does not make clinical decisions, which remain the responsibility of radiologists and ordering physicians.
Physicians receive a Best Practice Advisory (BPA) alert directly within the EHR, which displays findings and provides workflows to order appropriate follow-up studies.
Patients receive notifications through their online portals with study results; if they do not use the portal or have no primary physician, follow-up nurses manage direct outreach to ensure care continuity (a sketch of this routing logic appears below).
In one year, more than 460,000 imaging studies were screened and about 23,000 lung findings, roughly 5% of studies, were flagged as requiring follow-up, demonstrating both how common incidental findings are and how effectively the AI alert system manages follow-ups.
Northwestern Medicine trained nurses and frontline staff on light duty to annotate and label relevant radiology report data in-house, producing high-quality, expert-reviewed data cost-effectively.
A multidisciplinary team from Radiology, Quality, Patient Safety, Process Improvement, Primary Care, Nursing, Informatics, and others collaborated to design and implement the AI follow-up alert system.
Integration ensures alerts appear in the existing physician workflow without requiring additional software access, improving usability and response time to incidental findings.
The system is also being expanded to cover hepatic, thyroid, and ovarian findings requiring follow-up, further reducing missed or delayed care across more conditions.
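As referenced above, the patient outreach rules can be summarized in a short routing function. This is a minimal sketch under assumed field names (uses_portal, has_primary_physician); the article does not describe how outreach is actually coordinated in software:

```python
# Hypothetical routing of result notifications, following the outreach rules
# described above: portal message where possible, nurse outreach otherwise.
# The Patient class and its fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Patient:
    patient_id: str
    uses_portal: bool
    has_primary_physician: bool

def route_notification(patient: Patient) -> list:
    """Return the outreach actions for a flagged result."""
    actions = []
    if patient.uses_portal:
        actions.append("send portal message with study results")
    if not patient.uses_portal or not patient.has_primary_physician:
        # Follow-up nurses handle direct outreach to keep care on track.
        actions.append("assign to follow-up nurse for direct outreach")
    return actions

print(route_notification(Patient("p-001", uses_portal=True, has_primary_physician=True)))
print(route_notification(Patient("p-002", uses_portal=False, has_primary_physician=False)))
```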