
# How to Self-Host an AI Customer Chat Agent with Ollama + ChatterMate (No API Keys Needed)
So you want an AI-powered customer chat widget on your site, but you don't want to send every conversation to OpenAI or pay per token? ChatterMate is an open-source AI chat agent that works seamlessly with Ollama, meaning you can run the entire stack locally. No API keys. No cloud dependency. Full data ownership.

In this post, I'll walk through getting ChatterMate running with Ollama on your own server.

## What you'll need

- A Linux server (or Mac/Windows with Docker)
- Docker and Docker Compose
- About 8GB RAM for running a decent local model

## Step 1: Clone the repo

```shell
git clone https://github.com/chattermate/chattermate.chat.git
cd chattermate.chat
```

## Step 2: Set up Ollama

Install Ollama and pull a model:

```shell
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.2
```

## Step 3: Configure and run

Copy the example environment file and configure it to point to your local Ollama instance. Then spin everything up with Docker Compose.

## What's in v1.0.9 (latest release)

- Slack integration with OAuth Secu
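Step 3 above can be sketched roughly like this. Note that the file names (`.env.example`) and environment variable keys below are assumptions for illustration, not confirmed from the ChatterMate repo; check the project's README for the actual names it uses:

```shell
# Copy the example env file (assumed name) and edit it.
cp .env.example .env

# Point ChatterMate at the local Ollama server.
# Ollama listens on port 11434 by default; the variable
# names here are illustrative, not taken from the repo:
#   OLLAMA_BASE_URL=http://localhost:11434
#   OLLAMA_MODEL=llama3.2

# Bring the whole stack up in the background.
docker compose up -d

# Sanity-check that Ollama is reachable before testing the widget.
curl http://localhost:11434/api/tags
```

The `curl` check at the end hits Ollama's local REST API and lists pulled models, which is a quick way to confirm the model server is up before debugging the chat widget itself.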
Continue reading on Dev.to



