
Sending a Million Rows from the Backend: Streaming, Batching, Compression & Protocol Buffers

Your database has a million rows. Your client needs them. What could go wrong? Everything, actually. If you try to serialize a million rows into a single JSON response, you'll blow up your server's memory, time out the request, make the client wait forever, and then crash the browser trying to parse a 500 MB JSON blob.

Sending large datasets is a problem that touches every layer: the database, the serialization format, the transport protocol, compression, and how you structure the data flow. Get any of these wrong and your system falls over.

This guide covers every major approach — from pagination basics to gRPC streaming, from backpressure management to Parquet exports. With real code, real benchmarks, and real-world examples. Let's get into it.

The Problem: Why One Giant JSON Response Is a Terrible Idea

Here's the naive approach:

app.get('/api/users', async (req, res) => {
  const user
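The preview cuts the snippet off mid-line. As a stand-in, here is a hedged sketch of what a naive "one giant JSON" handler typically looks like, with mock Express and database objects so it runs on its own. The `registerNaiveRoute` helper, the fake `db`, and the row shape are assumptions for illustration, not the article's actual code — the point is only to show that even a tenth of the dataset already produces a multi-megabyte string held entirely in memory.

```javascript
// A sketch of the naive approach: every row in RAM, one giant response.
// registerNaiveRoute, the in-memory `db`, and the row shape are assumptions.

function registerNaiveRoute(app, db) {
  app.get('/api/users', async (req, res) => {
    // Pulls EVERY row into memory at once...
    const users = await db.query('SELECT * FROM users');
    // ...then serializes the whole array into one giant string.
    res.json(users);
  });
}

// --- Stand-ins for Express and the database, so the sketch runs standalone ---
const routes = {};
const app = { get: (path, handler) => { routes[path] = handler; } };

// Fake table: 100k rows is already enough to show the problem.
const rows = Array.from({ length: 100_000 }, (_, i) => ({
  id: i,
  name: `User ${i}`,
  email: `user${i}@example.com`,
}));
const db = { query: async () => rows };

registerNaiveRoute(app, db);

const res = {
  json(payload) {
    const body = JSON.stringify(payload); // the whole blob lives in RAM at once
    console.log(`${payload.length} rows -> ${(body.length / 1048576).toFixed(1)} MB of JSON`);
  },
};

routes['/api/users']({}, res);
```

At 100k rows the serialized body is already several megabytes; scale to a million rows and both the string and the row objects behind it sit in the server's heap simultaneously, which is exactly the failure mode the article describes.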
Continue reading on Dev.to




