
Prompt management, RAG, and agents with HazelJS
One starter: typed prompt templates, a live registry, FileStore persistence, RAG, supervisor agents, and AI tasks, all driven by the same prompt system.

Introduction

Managing LLM prompts well is hard: you want versioning, overrides without redeploys, and a single place that RAG, agents, and plain AI tasks all read from. The HazelJS Prompt Starter shows how to do exactly that. Built on HazelJS, @hazeljs/prompts, @hazeljs/rag, and @hazeljs/agent, it gives you a PromptRegistry with typed templates, FileStore persistence, and a REST API to inspect and override any prompt at runtime. RAG answer synthesis, the supervisor agent, worker agents, and four AI tasks (welcome, summarize, sentiment, translate) all use that same registry. In this post we walk through what's in the starter and how to use it.

What's in the box

| Feature | Description |
| --- | --- |
| PromptTemplate | Typed {variable} rendering with full TypeScript inference |
| PromptRegistry | Global prompt store: register, override, version at runtime |
| FileStore | … |
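To make the first two rows concrete, here is a minimal, self-contained sketch of typed `{variable}` rendering and a runtime-overridable registry. The class names mirror the starter's (`PromptTemplate`, `PromptRegistry`), but the actual @hazeljs/prompts API may differ; this is an illustration of the idea, not the library's implementation.

```typescript
// Extract "{name}" placeholders from a template string at the type level,
// so render() knows exactly which variables a template requires.
type Vars<S extends string> =
  S extends `${string}{${infer V}}${infer Rest}` ? V | Vars<Rest> : never;

class PromptTemplate<S extends string> {
  constructor(readonly template: S) {}
  // vars is typed from the template literal: missing keys fail to compile.
  render(vars: Record<Vars<S>, string>): string {
    return this.template.replace(/\{(\w+)\}/g, (_, k: string) =>
      (vars as Record<string, string>)[k] ?? `{${k}}`);
  }
}

class PromptRegistry {
  private prompts = new Map<string, PromptTemplate<string>>();
  register(name: string, t: PromptTemplate<string>): void {
    this.prompts.set(name, t);
  }
  // Runtime override: swap a prompt's text without redeploying.
  override(name: string, template: string): void {
    this.prompts.set(name, new PromptTemplate(template));
  }
  get(name: string): PromptTemplate<string> {
    const t = this.prompts.get(name);
    if (!t) throw new Error(`unknown prompt: ${name}`);
    return t;
  }
}

const registry = new PromptRegistry();
registry.register("welcome", new PromptTemplate("Hello {name}, welcome to {app}!"));
console.log(registry.get("welcome").render({ name: "Ada", app: "HazelJS" }));
// Later, an override (e.g. triggered through a REST endpoint) takes effect
// for every consumer that reads from the registry:
registry.override("welcome", "Hi {name}! {app} is glad to have you.");
console.log(registry.get("welcome").render({ name: "Ada", app: "HazelJS" }));
```

Because RAG synthesis, agents, and tasks all resolve prompts through one registry, an override like the one above changes behavior everywhere at once, which is the starter's central design choice.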
Continue reading on Dev.to.



