
Zero-dependency algebraic memory for AI agents, with belief change detection
# I built algebraic memory for AI agents — and then taught it to notice when beliefs change

Most agent memory is either RAG (semantic search over chunks) or key-value stores. Neither gives you structured fact recall and temporal awareness. I built two libraries that stack:

## hrr-memory — instant fact recall, no embeddings

Stores (subject, relation, object) triples using holographic reduced representations — circular convolution over high-dimensional vectors. Sub-2ms queries, zero dependencies, no vector database.

```js
import { HRRMemory } from 'hrr-memory';

const mem = new HRRMemory();
mem.store('alice', 'lives_in', 'paris');

mem.query('alice', 'lives_in');    // → { match: 'paris', confident: true }
mem.ask("Where does alice live?"); // → { match: 'paris' }
```

Complements RAG — use HRR for structured facts ("What is Alice's timezone?") and RAG for fuzzy search ("Find notes about deployment").

## hrr-memory-obs — detect when facts change

Wraps hrr-memory with a timeline, conflict detection, and observation
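The "circular convolution over high-dimensional vectors" mentioned above can be sketched in plain JavaScript: bind a relation vector to an object vector with circular convolution, recover the object with circular correlation, and pick the best candidate by cosine similarity. The dimensionality, vector construction, and function names here are my own illustration of the underlying HRR technique, not hrr-memory internals.

```js
const D = 1024; // vector dimensionality (illustrative choice)

// Random vector with components ~ N(0, 1/D), via the Box-Muller transform
function randVec() {
  const v = new Float64Array(D);
  for (let i = 0; i < D; i++) {
    const u1 = Math.random() || 1e-12;
    const u2 = Math.random();
    v[i] = (Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2)) / Math.sqrt(D);
  }
  return v;
}

// Circular convolution (binding): out[k] = Σ_i a[i] * b[(k - i) mod D]
function cconv(a, b) {
  const out = new Float64Array(D);
  for (let k = 0; k < D; k++) {
    let s = 0;
    for (let i = 0; i < D; i++) s += a[i] * b[(k - i + D) % D];
    out[k] = s;
  }
  return out;
}

// Circular correlation (approximate unbinding): out[k] = Σ_i a[i] * b[(k + i) mod D]
function ccorr(a, b) {
  const out = new Float64Array(D);
  for (let k = 0; k < D; k++) {
    let s = 0;
    for (let i = 0; i < D; i++) s += a[i] * b[(k + i) % D];
    out[k] = s;
  }
  return out;
}

function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < D; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
  return dot / Math.sqrt(na * nb);
}

// Encode (alice, lives_in, paris) as lives_in ⊛ paris, then query by
// unbinding lives_in and comparing the noisy result against candidates.
const livesIn = randVec(), paris = randVec(), tokyo = randVec();
const trace = cconv(livesIn, paris);
const noisyParis = ccorr(livesIn, trace);

console.log(cosine(noisyParis, paris) > cosine(noisyParis, tokyo)); // true
```

The recovered vector is only approximately the original, which is why a real implementation keeps a cleanup memory of known vectors and reports a confidence alongside the best match.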
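The teaser cuts off before showing hrr-memory-obs itself, so here is a minimal sketch of the belief-change idea under my own assumptions: keep a per-(subject, relation) timeline and flag a conflict when a new observation contradicts the current belief. `ObservedFacts`, `observe`, and the returned shape are hypothetical names, not the library's API.

```js
// Hypothetical sketch of belief-change detection (not the hrr-memory-obs API)
class ObservedFacts {
  constructor() {
    this.timeline = new Map(); // "subject|relation" -> [{ object, at }, ...]
  }

  // Record an observation; report whether it changes the current belief.
  observe(subject, relation, object, at = Date.now()) {
    const key = `${subject}|${relation}`;
    const history = this.timeline.get(key) ?? [];
    const prev = history[history.length - 1];
    history.push({ object, at });
    this.timeline.set(key, history);
    // A conflict is a new value for a fact we already believed.
    if (prev && prev.object !== object) {
      return { changed: true, from: prev.object, to: object };
    }
    return { changed: false };
  }

  history(subject, relation) {
    return this.timeline.get(`${subject}|${relation}`) ?? [];
  }
}

const obs = new ObservedFacts();
obs.observe('alice', 'lives_in', 'paris');
const change = obs.observe('alice', 'lives_in', 'tokyo');
console.log(change); // → { changed: true, from: 'paris', to: 'tokyo' }
```

Keeping the full history rather than overwriting is what makes temporal queries ("what did we believe last week?") possible on top of plain fact recall.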
Continue reading on Dev.to Webdev




