
I Built an AI Agent That Calls Me on the Phone
How I wired Twilio, Claude, and ElevenLabs into an autonomous agent that picks up the phone when it needs a decision.

A few weeks ago, I was describing my AI agent to a friend. I told her he'd gone fully autonomous: that he'd call me at 3am to ask me to reset the gateway if something went down. She thought I was exaggerating. I wasn't. But at that point, the phone-calling part was aspirational. The agent could message me on Telegram, automate browsers, deploy smart contracts, and publish articles across four platforms in a single session. But he couldn't actually call me.

So we built it. In one session. Here's exactly how.

The Architecture

The stack is surprisingly simple once you see it:

    Phone call → Twilio → ConversationRelay → WebSocket → Your Server
                                                               ↕
                                                        Claude (brain)
                                                               ↕
                                                      ElevenLabs (voice)

Twilio handles the telephony: it places and receives actual phone calls. ConversationRelay is Twilio's WebSocket bridge that handles speech-to-text.




