Building a Streamable HTTP MCP Server: From stdio to Vercel Serverless

By Rafael Silva, via Dev.to

The Model Context Protocol (MCP) is rapidly becoming the standard way AI agents discover and use tools. But most MCP servers today use the stdio transport: they run locally and communicate through standard input/output. That's fine for desktop use, but what about cloud deployment?

In this post, I'll walk through how I migrated an MCP server from stdio to Streamable HTTP, deployed it on Vercel's free tier, and got it listed on Smithery.ai, all in a single afternoon.

The Problem: stdio Doesn't Scale

When you build an MCP server with stdio transport, it works great locally:

```shell
npx -y @anthropic/mcp-server-my-tool
```

But platforms like Smithery.ai, which host and proxy MCP servers for thousands of users, need an HTTP endpoint they can call. The MCP specification defines two remote transports:

- SSE (Server-Sent Events): the older approach, being deprecated
- Streamable HTTP: the new standard (March 2025 spec)

The Architecture Decision

I had three options:

| Approach | Pros | Cons |
| --- | --- | --- |
| Express + Streamable | | |
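To make the Streamable HTTP transport concrete before diving into the options, here is a minimal sketch of its core idea: a single POST endpoint that accepts a JSON-RPC request body and returns a JSON response (long-running calls may instead stream an SSE response). The `handleRpc` function, the `echo` tool, and the `/mcp` path are my own illustrative choices, not the article's code; a real server would use the official `@modelcontextprotocol/sdk` transport classes rather than hand-rolling the protocol like this.

```typescript
import * as http from "http";

type RpcRequest = { jsonrpc: "2.0"; id?: number | string; method: string; params?: unknown };
type RpcResponse = {
  jsonrpc: "2.0";
  id: number | string | null;
  result?: unknown;
  error?: { code: number; message: string };
};

// Dispatch a single JSON-RPC request. Only two MCP methods are stubbed
// here, just enough to show the request/response shape of the protocol.
function handleRpc(req: RpcRequest): RpcResponse {
  const id = req.id ?? null;
  switch (req.method) {
    case "initialize":
      return {
        jsonrpc: "2.0",
        id,
        result: {
          protocolVersion: "2025-03-26",
          capabilities: { tools: {} },
          serverInfo: { name: "demo", version: "0.1.0" },
        },
      };
    case "tools/list":
      return {
        jsonrpc: "2.0",
        id,
        result: {
          tools: [
            {
              name: "echo",
              description: "Echo back the input",
              inputSchema: { type: "object", properties: { text: { type: "string" } } },
            },
          ],
        },
      };
    default:
      // JSON-RPC "method not found" error code.
      return { jsonrpc: "2.0", id, error: { code: -32601, message: `Method not found: ${req.method}` } };
  }
}

// Streamable HTTP boils down to: POST the JSON-RPC body to one endpoint,
// get the response back on the same connection. No long-lived side channel
// is required, which is what makes it a fit for serverless platforms.
const server = http.createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/mcp") {
    res.writeHead(404).end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const response = handleRpc(JSON.parse(body) as RpcRequest);
    res.writeHead(200, { "Content-Type": "application/json" }).end(JSON.stringify(response));
  });
});
```

Because each request is a self-contained POST, a serverless function can handle it with no persistent process, which is exactly the property that stdio lacks.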

Continue reading on Dev.to
