
PII-aware routing: how to use cloud AI and keep your sensitive data local
Here's the tension at the heart of every personal AI system: cloud models are better at reasoning, but your data is private. A self-hosted system can run everything locally — but a 2B parameter model on a mini-PC isn't going to draft a nuanced email response or analyze a complex financial situation the way a frontier model can.

The naive solutions are both bad. "Send everything to the cloud" means your diary entries, medical notes, and financial records pass through someone else's servers. "Run everything locally" means accepting worse reasoning on tasks where model quality actually matters.

We built a third option: a PII-aware routing layer that classifies every piece of data by sensitivity, routes it to the right model, and pseudonymizes anything sensitive that needs cloud reasoning power.

The classification: four levels, zero LLM calls

Every record in the system gets a sensitivity level. The classification is entirely deterministic — regex patterns and domain rules. No LLM in the classification…
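The excerpt doesn't show the actual rules, but deterministic regex-based classification of this kind can be sketched in a few lines. The level names, patterns, and the cloud/local threshold below are illustrative assumptions, not the article's real scheme:

```python
import re

# Hypothetical sensitivity levels; the article names four but doesn't
# list them in this excerpt, so these labels are placeholders.
PUBLIC, INTERNAL, SENSITIVE, SECRET = range(4)

# Deterministic rules: pattern -> minimum sensitivity level it triggers.
RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), SECRET),       # SSN-shaped
    (re.compile(r"\b\d{13,19}\b"), SECRET),               # card-number-shaped
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), SENSITIVE),  # email address
    (re.compile(r"\b(diagnos\w+|prescription)\b", re.I), SENSITIVE),
]

def classify(text: str) -> int:
    """Return the highest sensitivity level triggered by any rule."""
    level = PUBLIC
    for pattern, rule_level in RULES:
        if pattern.search(text):
            level = max(level, rule_level)
    return level

def route(text: str) -> str:
    """Send low-sensitivity text to the cloud model, the rest local."""
    return "cloud" if classify(text) <= INTERNAL else "local"
```

Because every rule is a plain regex or lookup, classification costs microseconds and never leaks data to a model just to decide where the data may go.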
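The routing layer is said to pseudonymize sensitive data that still needs cloud reasoning. One minimal way to do that, sketched here as an assumption since the excerpt doesn't show the actual scheme, is to swap each detected PII span for a stable placeholder token and keep the reverse mapping local:

```python
import re
from itertools import count

# Only emails for brevity; a real system would cover every PII pattern.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def pseudonymize(text: str):
    """Replace PII with stable placeholders; return (redacted, reverse map)."""
    mapping: dict[str, str] = {}
    counter = count(1)

    def repl(match: re.Match) -> str:
        value = match.group(0)
        # Same value -> same token, so the cloud model can still reason
        # about "EMAIL_1" as one consistent entity.
        if value not in mapping:
            mapping[value] = f"EMAIL_{next(counter)}"
        return mapping[value]

    redacted = EMAIL.sub(repl, text)
    return redacted, {token: value for value, token in mapping.items()}

def restore(text: str, reverse_map: dict[str, str]) -> str:
    """Re-insert the original values into the cloud model's response."""
    for placeholder, original in reverse_map.items():
        text = text.replace(placeholder, original)
    return text
```

The reverse map never leaves the local machine: the cloud model sees only placeholders, and the response is rehydrated after it comes back.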
Continue reading on Dev.to



