agents.txt — a proposed web standard for AI agents


via Dev.to Webdev, by Jasper van Veen

The web has robots.txt. It's been around since 1994, and it answers one question well: can you look at this? AI agents don't just look. They book flights, submit forms, call APIs, authenticate as users, and transact on behalf of people. And there's no standard for any of it. I've been thinking about this gap for a while, and last week I drafted a proposal: agents.txt.

The idea

Place a file at https://yourdomain.com/agents.txt. It tells agents what they can do, how to do it, and under what terms:

```
Site-Name: ExampleShop
Site-Description: Online marketplace for sustainable home goods.
Allow-Training: no
Allow-RAG: yes
Allow-Actions: no
Preferred-Interface: rest
API-Docs: https://api.exampleshop.com/openapi.json
MCP-Server: https://mcp.exampleshop.com

[Agent: *]
Allow: /products/*
Allow: /search
Disallow: /checkout

[Agent: verified-purchasing-agent]
Allow: /checkout
Auth-Required: yes
Auth-Method: oauth2
Allow-Actions: yes
```

Why would agents comply? Two reasons: Self-interest. When a site
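To make the proposed format concrete, here is a minimal sketch of how an agent might parse such a file. Note that agents.txt is only a draft proposal: the field names and the `[Agent: name]` section syntax below are inferred from the example above, not from any finalized specification, and the function name `parse_agents_txt` is my own invention.

```python
def parse_agents_txt(text):
    """Parse the draft agents.txt format into global fields
    plus per-agent rule sections (a sketch, not a spec)."""
    config = {"global": {}, "agents": {}}
    current = config["global"]
    for raw in text.splitlines():
        line = raw.strip()
        # Skip blank lines and (assumed) comment lines.
        if not line or line.startswith("#"):
            continue
        # A section header like "[Agent: verified-purchasing-agent]"
        # starts a new per-agent block with path rule lists.
        if line.startswith("[Agent:") and line.endswith("]"):
            name = line[len("[Agent:"):-1].strip()
            current = config["agents"].setdefault(
                name, {"Allow": [], "Disallow": []}
            )
            continue
        # Everything else is a "Key: value" directive.
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key in ("Allow", "Disallow") and isinstance(current.get(key), list):
            current[key].append(value)  # path rules can repeat
        else:
            current[key] = value

    return config


if __name__ == "__main__":
    sample = """Site-Name: ExampleShop
Allow-RAG: yes

[Agent: *]
Allow: /products/*
Disallow: /checkout
"""
    print(parse_agents_txt(sample))
```

An agent would fetch `https://yourdomain.com/agents.txt`, parse it like this, then check the most specific matching `[Agent: ...]` section (its own identifier, falling back to `*`) before acting.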

Continue reading on Dev.to Webdev


