
your local AI agent shouldn't care which model you run
Most local AI apps that offer agent or coding features lock you into a handful of models. OpenAI function calling? Supported. Anthropic tool use? Supported. That random abliterated Llama finetune you pulled from HuggingFace last Tuesday? Good luck.

The core problem is tool calling. It's the mechanism that turns a chatbot into an agent: the model doesn't just generate text, it emits structured calls to functions like "read this file" or "run this shell command." But there's no universal standard for how models express tool calls. OpenAI uses a JSON schema format. Anthropic has its own protocol. Ollama exposes native tool calling for some models but not others. And many local models, especially uncensored, abliterated, or community-finetuned variants, don't support structured tool calling at all.

So we built a system in Locally Uncensored (v2.2.3) that makes our Codex coding agent work with any model. Not "any model from our approved list." Any model.
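Bridging that gap usually comes down to a text-level fallback: ask the model to emit its tool call as a fenced JSON block, then parse it back out of the raw completion. The snippet below is a minimal sketch of that idea under stated assumptions, not the app's actual implementation; the `extract_tool_call` helper and the `{"name": ..., "arguments": ...}` shape are illustrative choices, loosely modeled on the OpenAI-style call format.

```python
import json
import re

# Match a fenced ```json block anywhere in the model's free-form output.
TOOL_CALL_RE = re.compile(r"```json\s*(\{.*?\})\s*```", re.DOTALL)

def extract_tool_call(completion: str):
    """Return a normalized {'name': ..., 'arguments': {...}} dict, or None.

    Works with any model that can be prompted to emit JSON, including
    finetunes with no native tool-calling support.
    """
    match = TOOL_CALL_RE.search(completion)
    if match is None:
        return None  # model answered in prose; treat it as a plain chat turn
    try:
        payload = json.loads(match.group(1))
    except json.JSONDecodeError:
        return None  # malformed JSON; the caller can retry with a repair prompt
    if "name" not in payload:
        return None  # JSON, but not a tool call
    return {"name": payload["name"], "arguments": payload.get("arguments", {})}

completion = (
    "Sure, I'll read that file.\n"
    '```json\n{"name": "read_file", "arguments": {"path": "main.py"}}\n```'
)
print(extract_tool_call(completion))
# {'name': 'read_file', 'arguments': {'path': 'main.py'}}
```

The appeal of this approach is that it degrades gracefully: models with native tool calling can keep their structured path, while everything else funnels through the same plain-text parser.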



