The robots.txt Mistake That's Killing Your AI Search Visibility

via Dev.to Webdev, by Nikhil Goyal

There's a good chance your website is invisible to ChatGPT, Perplexity, and every other AI search engine, and the fix takes about two minutes.

I've been auditing sites for AI readability for the past year, and the single most common issue I find isn't bad content or missing schema. It's robots.txt blocking AI crawlers entirely. The site owner has no idea. They're optimizing content, writing FAQ pages, adding structured data, and none of it matters because the front door is locked.

Here's how to check yours and fix it.

The 30-second check

Run this right now:

    curl -s https://yoursite.com/robots.txt

Now look for any of these bot names in Disallow rules:

- GPTBot: OpenAI's crawler (powers ChatGPT citations)
- OAI-SearchBot: OpenAI's search indexer (powers ChatGPT search)
- ChatGPT-User: fetches pages when a ChatGPT user asks for live info
- ClaudeBot: Anthropic's training crawler
- Claude-SearchBot: Anthropic's search indexer (powers Claude's web search)
- PerplexityBot: Perplexity's search crawler
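If you'd rather script the check than eyeball the output, here is a minimal sketch using Python's standard-library robots.txt parser. The `blocked_ai_bots` helper and the `AI_BOTS` list are illustrative names, not anything from the article; the bot user agents themselves are the ones listed above.

```python
from urllib.robotparser import RobotFileParser

# AI crawler user agents named in the article
AI_BOTS = [
    "GPTBot", "OAI-SearchBot", "ChatGPT-User",
    "ClaudeBot", "Claude-SearchBot", "PerplexityBot",
]

def blocked_ai_bots(robots_txt: str, url: str = "/") -> list[str]:
    """Return the AI bots that this robots.txt blocks from fetching `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, url)]

# Example: a robots.txt that blocks GPTBot site-wide but allows everyone else
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""
print(blocked_ai_bots(sample))  # → ['GPTBot']
```

Feed it the body returned by the curl command above; any bot it prints is one that AI search engines are being told to stay away from.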

Continue reading on Dev.to Webdev
