
Your AI Conversations Are Getting Long — I Built a Tool to Distill, Compress, and Score Every Prompt (Open Source)
Every developer using AI coding tools has sessions that spiral. What starts as "add a login page" becomes a 100-turn conversation where 80% of the turns are noise — follow-up corrections, restated requirements, tangential debugging. The 20 turns that actually drove the implementation are buried.

I built reprompt to fix this. It started as a prompt scorer, evolved into a compression engine, and is now a conversation distillation tool. Everything runs locally, no LLM needed.

Conversation Distillation: 6 Signals, No LLM

reprompt distill --last extracts the important turns from your most recent AI session using 6 rule-based signals:

- Position — first and last turns carry more weight (they frame the task and conclude it)
- Length — substantive turns are longer than "yes" and "try again"
- Tool trigger — turns that invoke tool calls (file edits, test runs) drive actual work
- Error recovery — turns after failures carry debugging context
- Semantic shift — topic changes indicate new decisions or pivots
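To make the signal idea concrete, here is a minimal sketch of rule-based turn scoring along these lines. The weights, the length threshold, the turn schema (`text`, `tool_calls`, `error` keys), and the word-overlap proxy for semantic shift are all illustrative assumptions, not reprompt's actual heuristics.

```python
def turn_score(turns, i):
    """Score one turn using hypothetical rule-based signals (no LLM)."""
    turn, n = turns[i], len(turns)
    score = 0.0
    # Position: first and last turns frame the task and conclude it.
    if i == 0 or i == n - 1:
        score += 2.0
    # Length: substantive turns are longer than "yes" / "try again".
    if len(turn["text"].split()) > 15:
        score += 1.0
    # Tool trigger: turns that invoke tool calls drive actual work.
    if turn.get("tool_calls"):
        score += 1.5
    # Error recovery: turns right after a failure carry debugging context.
    if i > 0 and turns[i - 1].get("error"):
        score += 1.5
    # Semantic shift: low word overlap with the previous turn ~ topic change.
    if i > 0:
        prev = set(turns[i - 1]["text"].lower().split())
        curr = set(turn["text"].lower().split())
        union = prev | curr
        if union and len(prev & curr) / len(union) < 0.1:
            score += 1.0
    return score

def distill(turns, keep=3):
    """Return indices of the highest-scoring turns, in original order."""
    ranked = sorted(range(len(turns)), key=lambda i: turn_score(turns, i),
                    reverse=True)
    return sorted(ranked[:keep])

turns = [
    {"text": "Add a login page with email/password auth and session "
             "cookies to the Flask app we discussed."},
    {"text": "try again"},
    {"text": "Run the test suite now.", "tool_calls": ["run_tests"],
     "error": True},
    {"text": "The tests failed because the session secret is unset; set "
             "SECRET_KEY from the environment and retry."},
]
print(distill(turns, keep=2))
```

Because the signals are additive, a turn that is both long and an error-recovery step outranks a bare "try again" even though both sit mid-conversation; tuning the weights is where a real implementation would spend its effort.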
Continue reading on Dev.to



