How to Add Streaming AI Chat to Any Next.js App

Atlas Whoff · via Dev.to

Adding AI chat to a Next.js app has a few non-obvious pieces: streaming responses, client-side rendering of chunks, error handling for API failures, and keeping your API key off the client. This is the complete pattern.

## Architecture

Client (React) → POST /api/chat → Server (Route Handler) → Claude/OpenAI API → Stream back

The API key lives on the server; the client never sees it. Responses stream chunk by chunk for a responsive feel.

## 1. Route Handler (Server Side)

`app/api/chat/route.ts`:

```ts
import { NextRequest } from "next/server";
import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY!,
});

export async function POST(req: NextRequest) {
  const { messages } = await req.json();

  // Validate input
  if (!Array.isArray(messages) || messages.length === 0) {
    return new Response("Invalid messages", { status: 400 });
  }

  // Create streaming response
  cons
```
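The excerpt cuts off at the streaming step. The core of the remaining pattern is turning the model's stream of text chunks into a `ReadableStream` that a Route Handler can return as the `Response` body, and reading it chunk by chunk on the client. The sketch below is an assumption-labeled illustration, not the article's actual code: `fakeModelStream` is a stand-in for the SDK's stream (which yields text deltas as an async iterable), and `toReadableStream` / `readAll` are hypothetical helper names.

```typescript
// Stand-in for the model SDK's stream: an async iterable of text chunks.
// In the real handler this would come from the Anthropic/OpenAI streaming API.
async function* fakeModelStream(): AsyncGenerator<string> {
  for (const chunk of ["Hello", ", ", "world", "!"]) {
    yield chunk;
  }
}

// Server side: wrap an async iterable of text chunks in a ReadableStream,
// suitable for returning from a Route Handler as the Response body.
function toReadableStream(chunks: AsyncIterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      try {
        for await (const chunk of chunks) {
          controller.enqueue(encoder.encode(chunk));
        }
      } catch (err) {
        // Surface upstream API failures to the client instead of hanging.
        controller.error(err);
        return;
      }
      controller.close();
    },
  });
}

// Client side: read the streamed Response body chunk by chunk,
// appending each decoded piece to the rendered text.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let text = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```

In the handler, the wrapped stream would be returned as `new Response(toReadableStream(stream), { headers: { "Content-Type": "text/plain; charset=utf-8" } })`; on the client, you would update React state on each chunk rather than waiting for the full string as `readAll` does.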

Continue reading on Dev.to
