
Build a Streaming AI Chat with Next.js 16 and OpenAI (No Vercel AI SDK)
Most AI chat tutorials use the Vercel AI SDK. But you don't need it. Here's how to build streaming AI chat from scratch, giving you full control.

## Why skip the AI SDK?

- Full control over every byte in the stream
- Provider agnostic: swap OpenAI for Anthropic by changing one URL
- Smaller bundle: no extra dependencies

## The API Route

```typescript
// src/app/api/ai/chat/route.ts
import { NextRequest } from "next/server";

export async function POST(req: NextRequest) {
  const { messages } = await req.json();

  // Ask OpenAI for a streaming (SSE) completion.
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ model: "gpt-4o-mini", messages, stream: true }),
  });

  const stream = new ReadableStream({
    async start(controller) {
      const reader = response.body!.getReader();
      try {
        while (true) {
          const { done, value } = await reader.read();
          if (done) break;
          // Forward OpenAI's raw SSE bytes to the client unchanged.
          // (A TextDecoder is only needed if you want to inspect or
          // transform the chunks on the server.)
          controller.enqueue(value);
        }
      } finally {
        controller.close();
      }
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```
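Because the route passes OpenAI's server-sent events through verbatim, the browser receives `data: {...}` lines ending with `data: [DONE]` and has to parse them itself. Here is a minimal sketch of that client side; the helper names (`extractDeltas`, `streamChat`) and the line-buffering approach are ours, not part of the tutorial:

```typescript
// Hypothetical helper: pull assistant text deltas out of complete SSE lines.
function extractDeltas(sseText: string): string[] {
  const deltas: string[] = [];
  for (const line of sseText.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data: ")) continue;
    const payload = trimmed.slice("data: ".length);
    if (payload === "[DONE]") break; // OpenAI's end-of-stream sentinel
    try {
      const parsed = JSON.parse(payload);
      const content = parsed.choices?.[0]?.delta?.content;
      if (typeof content === "string") deltas.push(content);
    } catch {
      // Ignore malformed payloads; callers should only pass complete lines.
    }
  }
  return deltas;
}

// Hypothetical consumer: stream tokens from the route into a callback.
async function streamChat(
  messages: { role: string; content: string }[],
  onToken: (token: string) => void,
) {
  const res = await fetch("/api/ai/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // A network chunk can end mid-line; only parse complete lines and
    // keep the trailing partial line in the buffer for the next chunk.
    const lastNewline = buffer.lastIndexOf("\n");
    if (lastNewline === -1) continue;
    const complete = buffer.slice(0, lastNewline + 1);
    buffer = buffer.slice(lastNewline + 1);
    for (const delta of extractDeltas(complete)) onToken(delta);
  }
}
```

The buffering matters: SSE frames are line-delimited, but `reader.read()` returns arbitrary byte chunks, so a naive `JSON.parse` per chunk will intermittently fail on split lines.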




