
# Streaming AI Responses in Next.js: Claude, OpenAI, and the Vercel AI SDK
## Why Most AI Streaming Implementations Break

You call the OpenAI or Claude API with `stream: true`. The response streams. But your UI freezes, chunks arrive garbled, or the stream silently cuts off. Here's the correct way.

## The Streaming API Route

```ts
// app/api/chat/route.ts
import { NextRequest } from 'next/server'
import Anthropic from '@anthropic-ai/sdk'

const anthropic = new Anthropic()

export async function POST(req: NextRequest) {
  const { messages } = await req.json()

  const stream = anthropic.messages.stream({
    model: 'claude-sonnet-4-6',
    max_tokens: 2048,
    messages,
  })

  // Return a ReadableStream that forwards each text delta to the client
  return new Response(
    new ReadableStream({
      async start(controller) {
        for await (const chunk of stream) {
          if (
            chunk.type === 'content_block_delta' &&
            chunk.delta.type === 'text_delta'
          ) {
            controller.enqueue(new TextEncoder().encode(chunk.delta.text))
          }
        }
        controller.close()
      },
    }),
    {
      headers: { 'Content-Type': 'text/event-stream' },
    }
  )
}
```
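On the client, the "garbled chunks" symptom usually comes from decoding each chunk independently: a multi-byte UTF-8 character can be split across two network chunks, and a fresh `TextDecoder.decode()` call per chunk mangles it. Passing `{ stream: true }` tells the decoder to buffer incomplete sequences between calls. Here is a minimal, self-contained sketch of that reading loop; in a real component you would get the stream from `fetch('/api/chat', …).then(res => res.body)` and append each decoded piece to state, but the stream below is simulated so the decoding logic runs on its own:

```typescript
// Reads a byte stream to completion, decoding incrementally.
async function readStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader()
  // { stream: true } below keeps partial multi-byte characters buffered
  // between chunks -- omitting it is what garbles streamed text.
  const decoder = new TextDecoder('utf-8')
  let text = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    text += decoder.decode(value, { stream: true })
  }
  return text
}

// Simulated stream: the emoji's 4 UTF-8 bytes are split across two chunks.
const bytes = new TextEncoder().encode('Hello 👋 world')
const simulated = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(bytes.slice(0, 8)) // cuts the emoji in half
    controller.enqueue(bytes.slice(8))
    controller.close()
  },
})

readStream(simulated).then((text) => console.log(text)) // prints "Hello 👋 world"
```

The same loop works unchanged on a real `res.body` from the route above, since both are standard `ReadableStream<Uint8Array>` objects.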