
I Made My Portfolio Site AI-Agent-Ready — Here's Every File I Added
A practical guide to making your website discoverable by AI agents using robots.txt, llms.txt, A2A agent cards, MCP manifests, JSON-LD, and CORS headers. Every file included.

Everyone's blocking AI crawlers. I did the opposite. In early 2026 I added seven files and a few HTML tags to my static portfolio site. The goal: make it fully machine-readable so that when AI agents start discovering services, hiring contractors, and making purchasing decisions, my site is already in their index.

This isn't speculative. Google's A2A protocol defines how agents discover each other. Anthropic's MCP defines how agents invoke tools. ERC-8004 has 21,000+ agents registered on-chain. The infrastructure is live. The question is whether your site speaks the language.

Here's every layer I added, with the actual code.

The Stack

robots.txt                   ← "You're welcome here"
llms.txt                     ← "Here's who I am"
.well-known/ai.json          ← "Here's what I offer"
.well-known/agent-card.json  ← "Here's how to work with me" (A2A)
.well-kn
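The preview cuts off before the actual files, but the first layer, robots.txt, is straightforward to sketch. This is a minimal illustrative version, not the author's file: the user-agent tokens shown (GPTBot, ClaudeBot, Google-Extended, PerplexityBot) are the crawler names those vendors publish, and the sitemap URL is a placeholder.

```text
# robots.txt: explicitly welcome AI crawlers instead of blocking them.
# Crawler tokens are illustrative; check each vendor's current documentation.

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else is welcome too.
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Note that an explicit `Allow: /` for named agents is mostly a signal of intent; an empty `Disallow` would have the same crawling effect.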
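The llms.txt layer ("Here's who I am") follows the llms.txt proposal: a markdown file served at the site root with an H1 name, a one-line blockquote summary, and sections of annotated links. The name, links, and wording below are hypothetical placeholders, not the author's actual file.

```markdown
# Jane Doe

> Freelance full-stack developer. This file summarizes the site for
> language models and agents, per the llms.txt convention.

## Services

- [Consulting](https://example.com/consulting): Architecture reviews and code audits
- [Projects](https://example.com/projects): Selected case studies with outcomes

## Contact

- [Email](mailto:hello@example.com): Preferred channel for new work
```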
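The A2A layer ("Here's how to work with me") is an agent card served from `.well-known/`. The sketch below uses field names from the published A2A AgentCard schema as I understand it (name, description, url, version, capabilities, input/output modes, skills); verify against the current spec, and treat all values here as hypothetical.

```json
{
  "name": "Jane Doe Portfolio Agent",
  "description": "Answers questions about services, availability, and past work.",
  "url": "https://example.com/a2a",
  "version": "1.0.0",
  "capabilities": {
    "streaming": false,
    "pushNotifications": false
  },
  "defaultInputModes": ["text/plain"],
  "defaultOutputModes": ["text/plain"],
  "skills": [
    {
      "id": "describe-services",
      "name": "Describe services",
      "description": "Summarize offered services, rates, and availability.",
      "tags": ["portfolio", "services"]
    }
  ]
}
```

A static site can serve this as a plain file; the `url` endpoint only matters once you actually host an agent that speaks the protocol.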
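The "few HTML tags" presumably include the JSON-LD layer: a schema.org block embedded in the page head so agents can parse identity without scraping. A minimal sketch with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "url": "https://example.com",
  "jobTitle": "Full-stack developer",
  "sameAs": ["https://github.com/janedoe"]
}
</script>
```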
Continue reading on Dev.to.




