I Built a DistilBERT Classifier That Filters What Your Local LLM Remembers


via Dev.to Python, by Erenalp Çetintürk

Most local LLM setups have a memory problem. They either save everything — which means your assistant is cluttered with useless trivia and small talk — or they save nothing at all. I wanted something smarter, so I built MemoryGate.

The Problem

Imagine you tell your local LLM assistant:

"My mom was diagnosed with diabetes last week"
"My AWS API key is AKIA..."
"The project deadline is Friday"

And then later:

"What's the weather like?"
"Tell me a joke"
"What year was the Eiffel Tower built?"

All of these turns are treated equally. Most memory systems will either save all of them or none of them. But clearly the first three matter and the last three don't. That gap is what MemoryGate fills.

What MemoryGate Does

MemoryGate is a three-stage pipeline that runs entirely locally:

Stage 1 — Generate training data. A local LLM running via LM Studio generates labelled conversation examples. Each example is tagged as either important (label = 1) or not important (label = 0). High importance exampl
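To make the Stage 1 output concrete, here is a minimal sketch of what labelled examples in the (text, label = 1/0) format described above might look like, serialized as JSON Lines — a common layout for feeding a small classifier such as DistilBERT. The example texts and the JSONL layout are assumptions for illustration, not MemoryGate's actual file format.

```python
import json

# Hypothetical labelled examples in the format described above:
# label 1 = important (worth remembering), label 0 = not important.
examples = [
    {"text": "My mom was diagnosed with diabetes last week", "label": 1},
    {"text": "The project deadline is Friday", "label": 1},
    {"text": "Tell me a joke", "label": 0},
    {"text": "What year was the Eiffel Tower built?", "label": 0},
]

# One JSON object per line (JSONL) keeps the dataset streamable and
# easy to append to as the local LLM generates more examples.
jsonl = "\n".join(json.dumps(e) for e in examples)

with open("training_data.jsonl", "w", encoding="utf-8") as f:
    f.write(jsonl + "\n")

# Reading it back: each line parses independently.
with open("training_data.jsonl", encoding="utf-8") as f:
    loaded = [json.loads(line) for line in f]

print(len(loaded), "examples,", sum(e["label"] for e in loaded), "important")
```

A binary-labelled file like this is exactly the shape a sequence-classification fine-tune expects: the text becomes the model input and the 0/1 label becomes the target.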
