HIPAA is a federal law that sets rules to protect patients’ medical records and other personal health information. It requires covered entities (healthcare providers, health plans, and clearinghouses) and their business associates to keep this information private and secure, using safeguards such as encryption, access controls, and audit records of data use. These measures help prevent unauthorized access to or disclosure of sensitive information.
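Two of the safeguards above, controlling who can access data and keeping records of data use, can be sketched in a few lines. This is a hypothetical illustration only; the roles, record store, and function names are invented, not part of any real compliance toolkit.

```python
import datetime

# Hypothetical role list and in-memory audit trail (illustrative names).
AUTHORIZED_ROLES = {"physician", "nurse", "billing"}
audit_log = []

def read_record(user, role, patient_id, records):
    """Return a patient record only for authorized roles, logging every attempt."""
    allowed = role in AUTHORIZED_ROLES
    audit_log.append({
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "patient": patient_id,
        "granted": allowed,
    })
    if not allowed:
        raise PermissionError(f"role {role!r} may not view patient records")
    return records[patient_id]

records = {"p001": {"name": "[example]", "diagnosis": "[example]"}}
read_record("alice", "physician", "p001", records)       # granted, logged
try:
    read_record("mallory", "intern", "p001", records)    # denied, still logged
except PermissionError:
    pass
```

The point of the sketch is that denied attempts are logged too: the audit record of data use exists whether or not access was granted.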
FERPA protects the privacy of student education records. It mainly applies to schools, but it also matters in healthcare settings affiliated with universities and research centers, which often handle student data alongside health records. FERPA requires that student data be anonymized and that personally identifiable information (PII) be carefully controlled when AI systems process educational records. Noncompliance can lead to legal consequences and loss of trust.
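One concrete piece of this kind of PII control is stripping direct identifiers from a record before it reaches an AI system. A minimal sketch, assuming an illustrative set of field names (not drawn from any real student information system):

```python
# Hypothetical set of direct-identifier fields to remove before AI processing.
PII_FIELDS = {"name", "student_id", "email", "date_of_birth", "address"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in PII_FIELDS}

student = {
    "name": "Jane Doe",
    "student_id": "S-10442",
    "email": "jdoe@example.edu",
    "program": "Nursing",
    "gpa": 3.7,
}
safe = deidentify(student)
# safe == {"program": "Nursing", "gpa": 3.7}
```

Real de-identification also has to consider indirect identifiers (small cohorts, rare combinations of attributes); a field allow-list is only the first step.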
Healthcare leaders, practice owners, and IT managers must understand both sets of rules before adopting AI tools. Failing to limit data access to only what a task requires violates HIPAA’s minimum necessary standard and risks serious penalties; mishandling student data can likewise cause privacy breaches under FERPA.
An example of AI use in healthcare that focuses on privacy is the work done by UTHealth Houston and OpenAI. This project is one of the first in the United States to use AI technologies following HIPAA and FERPA rules. UTHealth Houston allows its students, teachers, and staff to use OpenAI’s ChatGPT Education tool for healthcare and educational projects while keeping data private.
Amar Yousif, the Chief Information Officer (CIO) at UTHealth Houston, said the goal is to make patient experience better, make operations smoother, and support research, all while following privacy laws. He talked about using “safe and trusted solutions” in the partnership. Xiaoqian Jiang, Chair of the Health Data Science and AI Department at UTHealth, added that this teamwork shows a strong focus on advancing health data science without giving up privacy and security.
From OpenAI’s side, Brad Lightcap, Chief Operating Officer, said it is important to use AI tools that help both clinical work and research while keeping HIPAA and FERPA rules in mind. Together, these leaders showed that teamwork between organizations can handle sensitive data properly and bring useful AI advances to healthcare.
AI can be useful and save time, but it also brings risks to data safety and privacy. Many AI tools learn from large datasets that can include more protected health information or student PII than a task actually requires, which conflicts with HIPAA’s minimum necessary standard. For example, an AI system that reads more patient or student data than it needs widens the surface for unauthorized disclosure.
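The minimum-necessary idea can be enforced mechanically with a per-task allow-list of fields, so extra protected data never leaves the system even if it is present in the record. A sketch, with hypothetical task names and fields:

```python
# Each AI task gets an explicit allow-list of fields (illustrative values).
TASK_FIELDS = {
    "appointment_reminder": {"first_name", "appointment_time"},
    "billing_summary": {"account_id", "balance"},
}

def minimum_necessary(task: str, record: dict) -> dict:
    """Keep only the fields the named task is allowed to see."""
    allowed = TASK_FIELDS[task]
    return {k: v for k, v in record.items() if k in allowed}

patient = {
    "first_name": "Sam",
    "appointment_time": "2024-07-01 09:30",
    "diagnosis": "hypertension",   # must never reach the reminder task
    "ssn": "000-00-0000",
}
payload = minimum_necessary("appointment_reminder", patient)
```

An allow-list is safer than a block-list here: any field not explicitly approved for the task is dropped by default, so newly added fields stay protected without a code change.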
AI systems also change as they learn from new information, which makes compliance harder to verify over time. It can also be unclear who is responsible when AI causes a privacy problem: the company making the AI, the healthcare center, or the users.
To manage these risks, healthcare organizations need strict data policies, such as limiting each AI task to the minimum data it requires, restricting access by role, and assigning clear responsibility for AI outputs.
Northern Michigan University runs a program that teaches healthcare workers about following HIPAA and FERPA rules. It shows them which AI tasks are allowed (like sending patient reminders without sharing private info) and which are not (like asking AI to get sensitive health or school data without enough protections).
AI systems need strong technical controls to meet privacy rules, such as encryption of data at rest and in transit, role-based access controls, and audit logging of every data access.
Emerging techniques like differential privacy and federated learning add further protection. Differential privacy adds calibrated random noise to query results so that no individual’s data can be inferred from them. Federated learning trains models on data that stays at each site, sharing only aggregated updates rather than the raw records.
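As a rough sketch (toy epsilon, invented site values, not a production implementation), the two techniques can be expressed as:

```python
import math
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Differentially private count: add Laplace(1/epsilon) noise.

    A counting query has sensitivity 1, so the noise scale is 1/epsilon.
    Sampling uses the Laplace inverse CDF with u ~ Uniform(-0.5, 0.5).
    """
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

def federated_mean(site_sums, site_counts):
    """Federated aggregation of a statistic: each site shares only its
    local sum and count; the raw records never leave the site."""
    return sum(site_sums) / sum(site_counts)

# Three hypothetical sites compute local sums of a lab value;
# only these aggregates travel to the coordinator.
sums = [410.0, 260.0, 330.0]
counts = [4, 2, 3]
global_mean = federated_mean(sums, counts)   # 1000.0 / 9
noisy = dp_count(9, epsilon=1.0)             # record count, with noise added
```

Smaller epsilon means more noise and stronger privacy; real deployments also track the cumulative privacy budget across repeated queries, which this sketch omits.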
AI helps not only in patient care but also in everyday healthcare work. AI automation can reduce the work staff needs to do, improve communication, and make managing sensitive data more accurate.
For example, AI phone automation can make patient communication easier. Companies like Simbo AI provide automated phone services to handle appointment scheduling, prescription refills, patient questions, and reminders. These systems keep health information safe and reduce mistakes that happen when people handle everything manually.
Microsoft Copilot, which works with Microsoft 365 apps, helps medical staff by automating paperwork: it can draft clinical notes and summarize patient histories. Since doctors spend about one-third of their time on documentation, this frees them to spend more time on patient care. Tools like Copilot still require proper setup for full HIPAA compliance, including encrypted systems, signed business associate agreements (BAAs), and regular staff training in data protection.
AI automation also helps with managing supplies, distributing resources, generating compliance reports, and analyzing health trends. These changes help healthcare centers use resources better and meet legal rules more easily.
To keep privacy and follow laws when using AI, healthcare workers need proper training about privacy rules and AI risks. Northern Michigan University offers training modules that last from 16 to 80 minutes. They teach how to use AI prompts that follow rules, why anonymizing data is important, and when to report privacy problems.
AI technology is always changing, so healthcare leaders and IT staff need to keep updating training. Knowing how to use AI tools legally is just as important as the AI technology itself.
Healthcare groups should also clearly assign roles like data security officers, compliance managers, and IT experts to manage AI systems. This makes sure people are responsible and can act quickly if there is a data breach or audit.
Though AI has many benefits, healthcare leaders should be careful when adding AI tools. Important steps include signing business associate agreements with vendors, limiting the data each tool can access, training staff on compliant use, and auditing AI systems regularly.
The UTHealth Houston and OpenAI partnership shows how healthcare groups can work with AI developers and experts to use AI safely. This teamwork helps healthcare centers protect sensitive information and improve both patient care and administrative work.
Healthcare administrators, practice owners, and IT managers in the United States must be careful when using AI because of HIPAA and FERPA rules. Collaborations like the one between UTHealth Houston and OpenAI show that AI can be used responsibly. AI can help improve operations, reduce mistakes, and make patient engagement better without risking data privacy. Staff training and strong technical safeguards are key to keeping privacy safe in this digital age.
UTHealth Houston has announced a collaboration with OpenAI to integrate AI technology into healthcare and education, the first collaboration of its kind in the United States.
The collaboration utilizes OpenAI’s ChatGPT Education tool, which will be made accessible to students, faculty, and staff at UTHealth Houston.
The solutions developed under this collaboration will comply with HIPAA and FERPA regulations, ensuring the protection of sensitive health and educational information.
The initiative aims to improve patient experience, drive innovative research, streamline operations, and enhance data analysis capabilities within the health care system.
Amar Yousif, vice president and CIO, and Xiaoqian Jiang, PhD, professor and chair of the Department of Health Data Science and Artificial Intelligence, represent UTHealth Houston.
This partnership enhances UTHealth Houston’s capabilities to develop AI-driven solutions, impacting healthcare and education while maintaining a strong focus on privacy and security.
Brad Lightcap, OpenAI’s chief operating officer, emphasized the importance of deploying AI for research and clinical work while prioritizing safety and compliance.
By leveraging cutting-edge AI technology, UTHealth aims to enhance its research and clinical practices, setting high standards for innovation in biomedical informatics.
The goals include improving healthcare delivery, fostering innovative research, and providing state-of-the-art analytical capabilities.
The partnership signifies a commitment to innovation and excellence in biomedical informatics, demonstrating the potential impact of AI in health care systems.