
# I built a universal memory layer that works across every AI tool — here's how it works
Every AI tool I use has its own memory silo. Claude knows I prefer Python. ChatGPT doesn't. Cursor doesn't know what I told Claude yesterday. Every tool starts from scratch.

So I built mem-mesh — a local proxy daemon that gives all your AI tools one shared memory.

## How it works

mem-mesh runs a local HTTP server on port 4117. You set one environment variable:

```shell
export ANTHROPIC_BASE_URL=http://localhost:4117
```

From that point, every AI API call goes through mem-mesh. It:

- Pulls your stored memories from `~/.mem-mesh/`
- Injects them into the system prompt of every request
- After each response, extracts new memory signals (using Claude Haiku) and stores them

## What gets remembered

mem-mesh categorizes memories into three buckets:

- **preferences** — "Prefers Python over JavaScript", "Likes concise answers"
- **facts** — "Lives in San Jose", "Works at a startup"
- **corrections** — "Function is `get_user`, not `fetch_user`"

All stored as plain Markdown in `~/.mem-mesh/`. Readable, editable, Git-backable, yours forever.
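To make the injection step concrete, here's a minimal sketch of what it could look like in Python. This is my illustration, not mem-mesh's actual code: it assumes memories live as `*.md` files under the store directory and that requests carry an Anthropic-style top-level `system` field; `load_memories` and `inject_memories` are hypothetical names.

```python
from pathlib import Path

def load_memories(root: Path) -> str:
    """Concatenate all Markdown memory files into one context block."""
    parts = []
    for md in sorted(root.glob("*.md")):
        # Use the file name (e.g. "preferences") as a section heading.
        parts.append(f"## {md.stem}\n{md.read_text().strip()}")
    return "\n\n".join(parts)

def inject_memories(request_body: dict, memories: str) -> dict:
    """Prepend stored memories to the request's system prompt."""
    if not memories:
        return request_body
    body = dict(request_body)  # don't mutate the caller's payload
    existing = body.get("system", "")
    body["system"] = f"Known about the user:\n{memories}\n\n{existing}".rstrip()
    return body
```

A proxy like mem-mesh would call `inject_memories` on each incoming request body before forwarding it upstream, so the model sees your stored context on every call without any per-tool setup.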
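The storage side is simple enough to sketch too. Assuming one Markdown file per bucket with one bullet per memory (a guess at the layout, not confirmed by the post), appending a new memory could look like:

```python
from pathlib import Path

# The three buckets described above; anything else is rejected.
BUCKETS = {"preferences", "facts", "corrections"}

def store_memory(root: Path, bucket: str, text: str) -> Path:
    """Append a memory as a Markdown bullet to its bucket file."""
    if bucket not in BUCKETS:
        raise ValueError(f"unknown bucket: {bucket}")
    root.mkdir(parents=True, exist_ok=True)
    path = root / f"{bucket}.md"
    with path.open("a") as f:
        f.write(f"- {text}\n")
    return path
```

Because the store is just append-only Markdown, you can inspect it with `cat`, edit it by hand, or version it with Git, which is exactly the portability argument the post makes.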




