
How to make your website AI-agent friendly in 30 minutes
AI agents are the new browsers. They're crawling, reading, and trying to interact with your site right now, and over 50% of websites are basically unusable for them. If you want agents to actually work with your site, here's how to fix that in 30 minutes.

## Step 1: Add llms.txt (5 min)

Create a `/llms.txt` file at your site root. This is a plain-text file that tells AI agents what your site does and how to use it. Think of it as `robots.txt`, but instead of saying "don't crawl this," you're saying "here's how to understand me." The spec lives at llmstxt.org.

Here's what a good one looks like:

```markdown
# Acme API
> Acme provides a REST API for managing invoices and payments.

## Docs
- [API Reference](/docs/api): Full endpoint documentation
- [Authentication](/docs/auth): How to authenticate requests
- [Webhooks](/docs/webhooks): Event notification setup

## Key Endpoints
- POST /api/invoices - Create a new invoice
- GET /api/invoices/{id} - Retrieve invoice details
- GET /api/customers - List customers
```
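Before shipping the file, it's worth a quick structural sanity check. Here's a minimal sketch in Python; the checks are heuristics based on the layout shown above (H1 title, blockquote summary, markdown link lists), not an official validator from the llmstxt.org spec:

```python
import re

def validate_llms_txt(text: str) -> list[str]:
    """Loose structural checks for an llms.txt file (heuristic, not the official spec)."""
    problems = []
    lines = [line.rstrip() for line in text.strip().splitlines()]
    # The file should open with a single '# Title' heading.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing top-level '# Title' on the first line")
    # A '> summary' blockquote should appear near the top.
    if not any(line.startswith("> ") for line in lines[:5]):
        problems.append("missing '> summary' blockquote near the top")
    # Sections should contain markdown links like [API Reference](/docs/api).
    if not re.search(r"\[[^\]]+\]\([^)]+\)", text):
        problems.append("no markdown links found in any section")
    return problems

sample = """# Acme API
> Acme provides a REST API for managing invoices and payments.

## Docs
- [API Reference](/docs/api): Full endpoint documentation
"""
print(validate_llms_txt(sample))  # → []
```

An empty list means the file passes the basic checks; anything else is a readable list of what's missing.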



