
Why I Ripped stream.pipe() Out of My Node.js API Gateway
When I started building Torus, a multi-core Layer 7 Edge API Gateway from scratch in Node.js, I handled incoming network requests the way I had always seen it done in standard web applications:

```typescript
let body = '';
req.on('data', (chunk: Buffer) => {
  body += chunk.toString();
});
req.on('end', () => {
  forwardToBackend(body);
});
```

It worked perfectly for lightweight tests. But as I started pushing concurrent loads and larger payloads through the proxy, my server began to choke: CPU usage spiked to 100%, the event loop lagged, and memory consumption grew uncontrollably until the process crashed.

I had fallen into a classic architectural trap: I was dragging raw TCP payload bytes directly into the V8 JavaScript engine's memory heap. Because the V8 heap has a strict memory limit, pulling massive payloads into user-space memory forces the Node.js Garbage Collector (GC) to work overtime. The GC halts the single-threaded event loop to clean up the allocated memory, which is exactly the lag-and-crash behavior I was seeing under load.
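The alternative to accumulating the body in a string is to forward each chunk as it arrives and let backpressure keep memory flat. Below is a minimal sketch of that idea using Node's `stream` pipeline with an in-memory sink standing in for the upstream connection; in a real gateway the sink would be an `http.request()` to the backend, and `bytesSeen` / `backend` are illustrative names, not part of Torus.

```typescript
import { pipeline } from 'node:stream/promises';
import { Readable, Writable } from 'node:stream';

// Stand-in for the upstream socket: each chunk is handled and released,
// so nothing accumulates on the V8 heap.
let bytesSeen = 0;
const backend = new Writable({
  write(chunk: Buffer, _enc, cb) {
    bytesSeen += chunk.length;
    cb(); // signalling completion here is what drives backpressure
  },
});

// Simulated client request body: three 1 KiB chunks, never joined into one string.
const client = Readable.from([
  Buffer.alloc(1024),
  Buffer.alloc(1024),
  Buffer.alloc(1024),
]);

// pipeline() wires the streams together and propagates errors and teardown.
await pipeline(client, backend);
console.log(`forwarded ${bytesSeen} bytes without buffering the body`);
```

Because `pipeline()` pauses the source whenever the sink's internal buffer fills, peak memory is bounded by the stream high-water marks rather than the payload size.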




