I built an open-source tool that stops personal data from leaking into AI chatbots

via Dev.to Python, by Twisted-Code'r

Ever copy-pasted something into ChatGPT and immediately thought "wait, should I have done that?" If you're building an AI app that handles user data, you need to know what's leaking into your LLM API before a regulator does.

That's the problem ShadowAudit solves. It sits between your app and any LLM API and scans every prompt before it leaves your system — catching emails, phone numbers, API keys, and Indian national IDs like Aadhaar and PAN numbers.

Two lines to integrate:

```python
sa = ShadowAudit.from_config("shadowaudit.yaml")
client = sa.wrap(openai.OpenAI())
```

That's it. Everything else stays the same. It also generates GDPR Article 30 compliance reports automatically from your audit log — one command, done.

I built this over the summer as part of my open-source portfolio. Would love feedback from the community.

GitHub: github.com/Jeffrin-dev/ShadowAudit
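To make the scanning step concrete, here is a minimal sketch of the kind of pattern-based PII detection the post describes. The pattern names, regexes, and `scan_prompt` function are illustrative assumptions, not ShadowAudit's actual implementation:

```python
import re

# Illustrative patterns only -- not ShadowAudit's real rule set.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    # Aadhaar: 12 digits, commonly grouped 4-4-4
    "aadhaar": re.compile(r"\b\d{4}\s?\d{4}\s?\d{4}\b"),
    # PAN: 5 uppercase letters, 4 digits, 1 uppercase letter
    "pan": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),
}

def scan_prompt(text: str) -> list[str]:
    """Return the names of PII patterns found in a prompt."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

print(scan_prompt("Contact me at jane@example.com, PAN ABCDE1234F"))
# → ['email', 'pan']
```

A real proxy would run a check like this on every outbound prompt and either block, redact, or log the match before the request reaches the LLM provider.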
