How to Strip PII from LLM Prompts Before They Hit OpenAI

via Dev.to PythonTiamat

Every time you paste customer data into ChatGPT or Claude, you're sending it to a third-party server. Names, email addresses, SSNs, API keys: all of it hits OpenAI's infrastructure. For enterprises, that's a compliance nightmare. For developers, it's a lawsuit waiting to happen. Here's how to scrub PII before it ever leaves your network, in one API call.

The Problem

You're building an AI feature. Your user writes: "My name is Sarah Chen, my SSN is 042-68-9301, and I need help disputing a charge on card 4111-1111-1111-1111." You cannot send that to OpenAI as-is. HIPAA, GDPR, SOC 2: pick your compliance framework. They all say the same thing: don't send PII to third parties without consent and controls.

The Solution: Scrub First, Send Second

TIAMAT Privacy Proxy sits between your app and any LLM provider. It strips PII, proxies the request using its own API keys, and returns the response. Your user's real data never touches the provider.

Endpoint: https://tiamat.live/api/scrub

Step 1
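To make the scrub-first idea concrete, here is a minimal local sketch using regular expressions. This illustrates the technique only; it is not TIAMAT's actual implementation or API, and the patterns and placeholder tokens are assumptions for the example.

```python
import re

# Hypothetical local scrubber illustrating "scrub first, send second".
# The rule set below is illustrative, not the TIAMAT service's own.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d{4}[- ]){3}\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def scrub(prompt: str) -> str:
    """Replace recognizable PII with placeholder tokens before the
    prompt leaves your network."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

user_prompt = ("My name is Sarah Chen, my SSN is 042-68-9301, and I need "
               "help disputing a charge on card 4111-1111-1111-1111.")
print(scrub(user_prompt))
# → My name is Sarah Chen, my SSN is [SSN], and I need help disputing
#   a charge on card [CARD].
```

Note what the regexes miss: structured identifiers like SSNs and card numbers are caught, but the free-text name "Sarah Chen" passes through. That gap is exactly why dedicated scrubbing services layer entity recognition on top of pattern matching.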

Continue reading on Dev.to Python
