
# Healthcare AI and HIPAA: Every Clinical Note You Send to ChatGPT Is a Potential Violation
Hospitals are using ChatGPT to summarize discharge notes. Clinicians are pasting medication lists into Claude. Billing teams are running insurance appeals through Gemini. Almost none of them have a Business Associate Agreement with the AI provider. Almost all of them are creating HIPAA exposure. Here's the technical and legal breakdown.

## The HIPAA Framework (Quick Version)

HIPAA's Privacy Rule and Security Rule require that any entity handling Protected Health Information (PHI) must have:

- **Limited disclosure** — share PHI only as necessary, and only with authorized entities
- **Business Associate Agreements (BAAs)** — written contracts with any third party that receives PHI
- **Security safeguards** — technical, administrative, and physical controls

When you send a clinical note to OpenAI's API, OpenAI receives PHI. Under HIPAA, that makes them a Business Associate. A BAA is required.

## Does OpenAI Sign BAAs?

Yes — but only for specific healthcare entities.
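Whatever the BAA situation, a common first-line mitigation is to strip obvious identifiers before a note ever leaves your infrastructure. Here is a minimal illustrative sketch — naive regexes covering only a handful of identifier types, with all patterns and the `scrub` helper being assumptions of this example. Real HIPAA Safe Harbor de-identification covers 18 identifier categories and generally requires a dedicated tool, not a regex pass:

```python
import re

# Illustrative only: a naive scrubber for a few obvious identifiers.
# Safe Harbor de-identification covers 18 categories (names, geography,
# dates, record numbers, biometrics, etc.); this is NOT a compliant tool.
PATTERNS = {
    "MRN": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b"),      # medical record numbers
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US social security numbers
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),  # US phone numbers
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),  # slash-formatted dates
}

def scrub(note: str) -> str:
    """Replace each matched identifier with a bracketed placeholder."""
    for label, pattern in PATTERNS.items():
        note = pattern.sub(f"[{label}]", note)
    return note

note = "Pt seen 3/14/2024, MRN: 00482913, callback 555-867-5309."
print(scrub(note))  # → Pt seen [DATE], [MRN], callback [PHONE].
```

Even with scrubbing, free-text clinical notes leak identity through context (rare diagnoses, family details), so redaction reduces risk but does not remove the need for a BAA.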




