
# agent.json: The Missing robots.txt for AI Agents

Your website probably has a robots.txt. It tells search engine crawlers what they can and can't access. Simple, universal, effective.

But there's nothing equivalent for AI agents. When a browser agent visits your site, it has zero context about what it can do there. So it dumps your entire DOM into an LLM (85,000+ tokens for a typical page), asks "what should I click?", and hopes for the best. This is expensive, slow, and error-prone.

## What agent.json does

agent.json is a proposed standard that lets websites declare their capabilities for AI agents. A site publishes a manifest at /.well-known/agent.json:

```json
{
  "capabilities": {
    "search": {
      "selector": "input#search-box",
      "method": "fill_and_submit"
    },
    "add_to_cart": {
      "api": "/api/cart/add",
      "method": "POST"
    }
  }
}
```

Instead of parsing the entire page, an agent reads the manifest and knows exactly how to interact. The difference in token cost: roughly 85,000 tokens versus ~200.

## Why this matters now

AI agents are proliferating fast. Claude, GPT, Gemini


